



Despite never having been battle-tested, Intel claims its smart phone processors will be the most powerful on the planet.  (Source: Funimation/Toei)
Hardware giant says its upcoming Atom-based smart phone processor will crush ARM chips in power and performance

Intel is certainly taking a bold, if dangerous, position.  Without having shipped a single smart phone system-on-a-chip, it is claiming that its first-generation smart phone chips will beat ARM designs when they launch later this year.

Speaking at Mobile World Congress 2011 in Barcelona, Spain, Anand Chandrasekher, senior vice president and general manager of Intel's Ultra Mobility Group, made this bold prediction.  He concedes that the upcoming smart phone core, dubbed Medfield, will only tie ARM cores in standby time.  But he claims it will blow away the competition in the amount of time the phone can remain active and how fast it can perform processing.

That seems a bit overly optimistic, given that Intel is only in its first generation, while most ARM CPU makers are well into their second or third generation.

The good news for Intel, though, is that it at least appears to be on track to deliver a product soon.  It says [press release] it is currently producing the smart phone chips, which should appear in products late this year.

The first-generation chips come with an HSPA+ modem, courtesy of the technology Intel gained in its $1.4B USD purchase of Infineon's wireless unit.  While ARM processors with on-chip LTE modems should be available near the start of next year, Intel's LTE-ready chips won't arrive until the 2012 holiday season.

The most immediate problem (other than living up to its huge claims) facing Intel is convincing hardware manufacturers to embrace Medfield.  If the performance is as Earth-shaking as Intel claims, that shouldn't be too hard to do -- but if the performance is closer to what one would expect in reality, it may be an uphill battle for Intel.  So far only LG has shown off an Atom-based smart phone prototype.  No other hardware partners have been announced, though Intel claims it should begin shipping product later this year.

The big issue facing Intel, though, is that if it can't live up to its boastful claims and take the fight to ARM, ARM will likely take the fight to it.  Qualcomm has already announced a quad-core 2.5 GHz ARM chip that will be available next year.  With Windows 8 set to support ARM-based PCs, Intel could be in a world of hurt in the power-conscious laptop market.





Good article except for the drama
By DallasTexas on 2/15/2011 12:59:01 PM , Rating: 3
Now for the facts...
* Intel never said crush the competition
* Intel did say it will tie ARM cores in standby time which looks pretty good given that in fact it will blow away the competition in performance.
* The x86 architecture TODAY blows away the competition in performance. The problem is that it is not found in any phones because the power is too high.

Summary
So, there you have it. X86 finally meets market requirements in low power and CONTINUES to blow away the competition.

In the mean time, the ARM architecture needs to add cores (and higher power) to raise the performance requirements while Intel follows Moore's law and uses their process technology to reduce power.

Who has the better chip? I suppose the one that is shipping today. Who has the better architecture? Hmm, not so fast.




RE: Good article except for the drama
By mcnabney on 2/15/2011 1:30:22 PM , Rating: 4
There is something missing from those assertions.

The new Atom could equal ARM for standby time, beat it handily in performance, but still be an unwanted product.

The remaining question was never asked/answered.

What kind of power does it draw when in use?

If the handset can only run applications for 45 minutes before draining the battery, this chip is a non-starter.


RE: Good article except for the drama
By SirKronan on 2/15/2011 11:45:44 PM , Rating: 3
quote:
The remaining question was never asked/answered. What kind of power does it draw when in use?

The remaining question was never asked/answered.


Yes they did, according to the article:

quote:
But he claims it will blow away the competition in the amount of time the phone can remain active and how fast it can perform processing.


RE: Good article except for the drama
By gamerk2 on 2/16/2011 9:01:37 AM , Rating: 4
AMD claims Bulldozer is 50% faster then Sandy Bridge.

Never trust the PR guys.


By silverblue on 2/16/2011 10:53:26 AM , Rating: 3
The original quote was 50% faster than i7-950.

Never trust Chinese whispers. ;)


RE: Good article except for the drama
By Einy0 on 2/16/2011 10:59:23 AM , Rating: 2
Where did you see AMD PR make that claim? Maybe AMD PR said that in X type of system load we believe we'll have a 50% performance advantage. Perhaps AMD said Bulldozer will have a 50% performance advantage per dollar spent on the CPU. I think after the K10 issue (TLB error & much lower performance than expected), AMD has calmed down about big claims.


By CrazyBernie on 2/17/2011 11:14:31 AM , Rating: 2
quote:
Yes they did, according to the article:

Actually, there's an "and" that cuts the two claims in half. So no... it doesn't with 100% confidence answer the question.

quote:
But he claims it will blow away the competition in the amount of time the phone can remain active and how fast it can perform processing.


"Remaining active" and "Drain under load" are two different beasts.


RE: Good article except for the drama
By MozeeToby on 2/15/2011 1:32:39 PM , Rating: 3
As far as I know, no one outside of Intel has seen these chips to do any independent tests. Not to say it isn't true, just that it needs to be taken with a big grain of salt. I don't doubt their power draw has improved dramatically though, they've learned all kinds of tricks over the last few years, even putting some of them into their desktop models to reduce cooling needs.

Their real problem might very well be software. I imagine the Windows phone software is probably x86 compatible, but I haven't heard much about the status of the x86 Android version (a quick look at their website suggests they're in the process of merging 2.2 into the repo). I suppose no one is throwing money at it right now, and that might change if Intel can offer a competitive mobile chip, but Intel is going to need to be significantly better than the competition for someone to be willing to put money into the project.


By Shining Arcanine on 2/16/2011 12:56:00 AM , Rating: 2
Android can be recompiled for x86 and most Android software runs on the Dalvik virtual machine. Intel could probably run 99% of what most Android users would run on ARM on x86.


RE: Good article except for the drama
By Da W on 2/15/2011 1:35:18 PM , Rating: 2
Instruction set means nothing. I've said it many times here, the X86 decoder footprint gets smaller every generation. At 32nm it's almost negligible. The rest of the CPU design can be anything the chip maker wants it to be.

And Intel has been the manufacturer of the best performing CPUs on the planet for 30+ years (except 2003-2006 when AMD had the crown). ARM design only just implemented dual-core out-of-order execution, most are built on 40nm still.

I believe Intel can pull it off. The problem is the OS, and even if Android runs on x86, will all the apps run too? And do you really need all that speed for a PHONE? When i want speed, i use my desktop.


RE: Good article except for the drama
By vol7ron on 2/15/2011 1:49:09 PM , Rating: 2
quote:
And Intel has been the manufacturer of the best performing CPUs on the planet for 30+ years

You mean to the mass market, as there are other processors that outperform Intel's, as well as processors not available to the public.

quote:
And do you really need all that speed for a PHONE? When i want speed, i use my desktop.

I think you're missing the effect of speed. Speed is also a reduction of power consumption. Less time = less energy. Not to mention that phones today still aren't great at multitasking; perhaps one day you'll be able to keep many apps fully running in the background instead of using save-states.
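The "less time = less energy" point is the race-to-idle argument. A toy calculation (all numbers hypothetical, not measured chip data) shows how a faster core that draws more while active can still finish a task on less total energy:

```python
# Race-to-idle sketch: energy = active_power * active_time + idle_power * idle_time.
# All figures here are made-up illustrations, not measured chip data.

def task_energy_mj(active_mw, runtime_s, idle_mw, window_s):
    """Millijoules consumed to finish one task within a fixed time window."""
    return active_mw * runtime_s + idle_mw * (window_s - runtime_s)

# Chip A: slow but frugal while active; chip B: fast but hungrier.
slow = task_energy_mj(active_mw=300, runtime_s=4.0, idle_mw=5, window_s=10)
fast = task_energy_mj(active_mw=600, runtime_s=1.5, idle_mw=5, window_s=10)

print(slow, fast)  # 1230.0 942.5 -> finishing sooner and sleeping can cost less overall
```

Whether the faster chip wins depends entirely on its idle draw and how quickly it finishes, which is exactly the unanswered load-power question raised above.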


By TeXWiller on 2/15/2011 4:06:23 PM , Rating: 2
quote:
quote:
And Intel has been the manufacturer of the best performing CPUs on the planet for 30+ years
You mean to the mass market, as there are other processors that outperform Intel's, as well as processors not available to the public.
Let's limit that to the x86 markets, and add the race to the first gigahertz to that exception as a nod to the AMD camp.


RE: Good article except for the drama
By zozzlhandler on 2/15/2011 4:11:36 PM , Rating: 2
Yes, all the Android apps *will* run on an x86 Android phone, because they are all written in Java, and executed on a VM, not on the native processor.


By psychobriggsy on 2/17/2011 7:22:28 AM , Rating: 2
Apart from all the apps written using the NDK, which is C/C++/assembler, that is.


RE: Good article except for the drama
By Calin on 2/17/2011 6:08:49 AM , Rating: 2
The x86 decoder footprint becomes larger every generation. However, this increase is much smaller than the growth of the rest of the processor. So, while the x86 decoder might be twice as big as the one in the Pentium 4, the total transistor count may be ten times higher, so the decoder's share of the die looks five times smaller.
On the other side, if you halve the size of the x86 decoder but cut the processor's transistor count by ten times (as in Atom), you end up with the decoder occupying five times the percentage area.
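That proportion argument can be sketched directly; the transistor budgets below are purely illustrative placeholders, not real die figures:

```python
# Illustrative transistor budgets only (arbitrary units), not real die data.
def decoder_share(decoder, total):
    """Fraction of the transistor budget spent on the x86 decoder."""
    return decoder / total

base = decoder_share(1.0, 100.0)    # reference core: decoder is 1% of the die
big  = decoder_share(2.0, 1000.0)   # decoder doubled, core grown 10x: 0.2%
tiny = decoder_share(0.5, 10.0)     # Atom-like: decoder halved, core shrunk 10x: 5%

print(big / base)   # 0.2 -> the decoder's share fell 5x on the big core
print(tiny / base)  # 5.0 -> ...and grew 5x on the small one
```

The same decoder overhead that vanishes in a desktop-class transistor budget looms large in a phone-class one, which is the crux of the x86-in-phones debate.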


RE: Good article except for the drama
By vol7ron on 2/15/2011 1:42:44 PM , Rating: 2
But x86 still includes many legacy instruction sets that are performance/power speed bumps.

It would be nice if we could get off these in the next 10 years.


By PrinceGaz on 2/15/2011 9:02:52 PM , Rating: 3
x86 may have loads of legacy instructions, but with each processor generation, they become less significant.

If old instructions really mattered, would AMD still include support for all the 3DNow! instructions in their latest chips, despite them not being used by anything significant for years (or indeed used for very much at all back when they might have made a difference)?

With each generation, and the process shrinks that come with it, legacy support does quite literally become that much smaller, to the point where the cost of supporting the decoding of legacy instructions is negligible. So rather than go down the Pentium Pro route and throw out all backward compatibility, the sensible approach of keeping hardware support for everything from the start has been adopted.

Even though OS support for 16-bit applications from those eras is being dropped, the hardware costs perhaps a hundred thousand or so transistors (the total number in the last 16-bit-only x86 CPU, the 80286, of which only a small fraction were used for instruction decoding even then) out of hundreds of millions in modern CPUs; the actual number needed is much less than that, as all they need to do is instruction decoding. It would be madness to remove them.

I'd be the first to agree the x86 instruction set has grown into a monster that looks like it will never be tamed as it gains ever more extensions, but the little you could gain by removing support for all the oldest instructions would be wiped out many times over by the next bunch of SSE instructions.


By geddarkstorm on 2/15/2011 3:38:50 PM , Rating: 1
Really? You say "good article" except for the drama, and then drum up drama of your own.

Let's wait till these chips are out and actually benchmarked. Atom has been a pretty poor performer so far. And sure, x86 by all appearances blows things away in performance... when that performance requires watts rather than the milliwatts of the ARM world. What's the performance-to-power ratio? That's the real kicker.

Considering what Qualcomm unveiled, I don't think Atom is going to be as incredible in comparison as you think it is. Never mind that Atom is just reaching the OoO and multicore stages too!


By zodiacfml on 2/15/2011 8:38:22 PM , Rating: 2
I just don't understand why some people think ARM is superior and will take over the world.
I kinda feel the only reason they think that is because Apple adopted it in its products.

The performance is really up there, and I think power efficiency will come close to the most powerful SoC when it comes out.

The only question is how successful or popular they are. Smartphones today are feature-packed with decent performance. I do not know of any application that might need more performance in a smartphone, except gaming.


By HotPlasma on 2/16/2011 7:29:05 AM , Rating: 2
quote:
In the mean time, the ARM architecture needs to add cores (and higher power) to raise the performance requirements


Not true. The quad-cores will reportedly use only 35% of the power of the current generation.

quote:
The entire package is expected to bump performance in excess of 150 percent, while cutting power consumption by 65 percent


http://www.dailytech.com/Qualcomm+Unleashes+Dual+a...
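Taken at face value, those two quoted figures compound; a quick sketch of the arithmetic (no benchmark data involved, just the marketing numbers above):

```python
# Combining only the two figures quoted above; illustrative arithmetic, not a benchmark.
perf_scale  = 1.0 + 1.50   # "in excess of 150 percent" more performance
power_scale = 1.0 - 0.65   # "cutting power consumption by 65 percent"

perf_per_watt_gain = perf_scale / power_scale
print(round(perf_per_watt_gain, 2))  # 7.14 -> roughly a 7x perf-per-watt claim, if both hold
```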


By psychobriggsy on 2/17/2011 7:20:53 AM , Rating: 2
Facts! Where are the facts - nothing is known about Medfield outside of Intel.

What is known is that NVIDIA have a quad-core Tegra coming out that can beat a 2GHz Core 2 Duo (admittedly in self-performed benchmarking). We know that Atom currently is significantly slower than a Core 2 Duo at 1GHz, never mind 2GHz, so we should be wary of what Intel's marketing is saying, and stop falling for it like DallasTexas has done.

Standby power means the cores are clock gated - you could have a fricking Nehalem core in there and the standby time would still look good. That's because it's not doing anything. Of course every time it gets woken up to process an event it will use power. ARM SoCs are going to be including low power cores (Cortex A5) for this alongside the high power cores (Cortex A9, A15).

The real facts are that ARM offers a product in every phone segment, provided by dozens of manufacturers competing with each other, resulting in a healthy competitive ecosystem. Intel will have to eat its lovely high margins in order to compete - although that's better than zero income. In the end it will get into some Windows 7 tablets, and that's about it.


By JumpingJack on 2/21/2011 2:35:22 AM , Rating: 2
I recall another bold claim by Intel:
http://news.cnet.com/Intel-strikes-back-with-next-...

and was also summarily dismissed
http://news.cnet.com/Intel-makes-performance-claim...

Then came the initial benches:
http://www.anandtech.com/show/1963

And nobody believed it....

Then came the actual data:
http://www.anandtech.com/show/2045/


oohhh - the Larrabee speech
By Fraggeren on 2/15/2011 12:56:39 PM , Rating: 3
Let's see




RE: oohhh - the Larrabee speech
By xeddit on 2/15/2011 1:57:58 PM , Rating: 2
My thoughts exactly... overpromise and underdeliver.

Awesome pic! Now if I could just remember his name..


RE: oohhh - the Larrabee speech
By mdbrotha on 2/15/2011 2:24:43 PM , Rating: 2
Mr. Satan or Hercule

Dragon Ball Z/GT


RE: oohhh - the Larrabee speech
By morphologia on 2/15/2011 3:24:39 PM , Rating: 3
Now, to make the comparison with Hercule Satan complete, Intel should call its next generation of mobile/ultraportable chips the "Super Megaton Miracle Chips."

Personally, I think "it's a trick!!!"


RE: oohhh - the Larrabee speech
By geddarkstorm on 2/15/2011 3:40:04 PM , Rating: 2
Those ARM chips' fast web page loading times? Those absurd graphics on the iPhone 4? Smoke and mirrors!


RE: oohhh - the Larrabee speech
By Master Kenobi (blog) on 2/15/2011 5:05:11 PM , Rating: 2
Speaking of web page loading times, Apple got smacked down in England for their commercials showing unrealistic speeds of the iPhone loading apps and navigating web pages. Smoke and mirrors indeed!


RE: oohhh - the Larrabee speech
By angryplayer on 2/19/2011 3:22:01 AM , Rating: 2
Blog it, dangit!


RE: oohhh - the Larrabee speech
By Jierdan on 2/15/2011 2:26:00 PM , Rating: 2
Yeah, Intel could be putting the cart before the horse. As for the pic, that's Mister Satan.


RE: oohhh - the Larrabee speech
By chick0n on 2/15/2011 2:54:13 PM , Rating: 2
That pic deserves a +10


RE: oohhh - the Larrabee speech
By TeXWiller on 2/15/2011 4:10:13 PM , Rating: 3
Be careful, it might go over 9000 really quick if people keep adding to it like that.


Sounds Impossible
By Flunk on 2/15/2011 12:56:20 PM , Rating: 4
Do you recall when Intel announced they would produce a high-performance discrete GPU (Larrabee) based on x86 cores that would compete with AMD and Nvidia? Do you recall how that turned out?

I think the concept of going with x86 on everything is fundamentally flawed (quite like the x86 instruction set). No matter how good Intel's engineers are matching ARM chips on both power and performance is completely impossible. Something has to give for the excess complication of the x86 instruction set.




RE: Sounds Impossible
By Ammohunt on 2/15/2011 2:34:16 PM , Rating: 2
Do you remember when they partnered with HP to create IA64, the next generation of server chips... Still trying to forget! R.I.P. PA-RISC


RE: Sounds Impossible
By name99 on 2/15/2011 3:38:13 PM , Rating: 2
Intel seems very much to be following the lead of Micro "we can so too make Windows work on a phone screen" soft.
It took what, four years, for MS to give up on the dream of Windows everywhere and settle for the rather more practical dream of MS everywhere. And what do you know? Once they actually ditched the stupidity of that idea, they came up with something not bad.

Let's see how long it takes for Intel to realize that x86 everywhere is an equally stupid idea. I suspect Intel could produce a damn fine mobile chip --- if they were willing to get past the ludicrous idea that anyone actually WANTS x86 compatibility on their phone.


RE: Sounds Impossible
By 91TTZ on 2/15/2011 5:11:05 PM , Rating: 2
quote:
I think the concept of going with x86 on everything is fundamentally flawed (quite like the x86 instruction set). No matter how good Intel's engineers are matching ARM chips on both power and performance is completely impossible. Something has to give for the excess complication of the x86 instruction set.


You're right that making all processors x86 isn't the best for power, speed, or efficiency, but you can see why Intel would want to do it. They own the x86 instruction set, for the most part; even if companies could copy their designs, they wouldn't be allowed to sell x86 chips on the market.


csb
By Natfly on 2/15/2011 3:40:34 PM , Rating: 2
quote:
And never mind that ARM is built at the same process node as its upcoming Atom-based cores.


Intel still has quite a lead in process tech. Medfield is/will be produced on 32nm. ARM isn't "built at" any specific process node; ARM chips are probably still being fabbed at everything from 90nm+ all the way down to the latest 40nm. TSMC's and GloFo's 28nm bulk (the closest comparable to Intel's 32nm) won't even be ready until later this year, let alone have any product available.

I doubt x86 will wtfpwn ARM in the near future, but as others have said, with each process shrink the differences between CISC and RISC instruction sets shrink as well. Eventually (if we get that far) the differences will be next to negligible.




RE: csb
By Einy0 on 2/16/2011 11:43:41 AM , Rating: 2
According to industry sources, IBM, Global Foundries, Samsung and others in the IBM Technology Alliance are sampling 28nm chips for production within the quarter. TSMC is supposedly not far behind. (Note: TSMC should join the IBM fab club lol...)


Problem for Intel. Not just CPU but GPU as well.
By fteoath64 on 2/19/2011 1:19:38 AM , Rating: 2
And we all know how bad Intel's integrated GPUs are! Only the ones in SB chips show respectable speeds, and those cannot be put into Medfield or similar. Just look at Nvidia's Kal-El demos of their quad-A9 (aka Tegra 3). It would take Intel 3-4 years to do this. Again, it is not about money, it is about capability.

The mobile domain is so competitive that matching existing platforms is not good enough. One needs to exceed them by 50% or more and have a decent power budget for whole-day battery life. Maybe Intel should invest in battery research?




By JumpingJack on 2/21/2011 2:39:11 AM , Rating: 2
Actually, Medfield will probably license some variant of the PowerVR SGX GPU from Imagination Technologies... just as Moorestown does, since Medfield is the follow-up to Moorestown with much more integration.

http://www.anandtech.com/show/3696/intel-unveils-m...

This will be the same graphics found in most other mobile devices such as the iPhone, iPad, many Android phones, etc.


New DT depths
By Taft12 on 2/15/2011 12:25:41 PM , Rating: 2
This is the largest gap between headline hot-air and the actual quote I've ever seen at DT (and that's saying a lot)




Intel destroys Atom!
By NellyFromMA on 2/15/2011 12:40:31 PM , Rating: 2
"Intel is confident that, defying the odds, somehow its chips will destroy Atom in power performance."

AHHHH Intel is attacking itself, run!




Not optimistic
By Guspaz on 2/15/2011 2:15:10 PM , Rating: 2
Intel talks a good talk, but their track record is pretty poor. Their Atom parts have largely languished for years since being introduced, without seeing any significant upgrades (most of the updates have been chipset-related, or stuff like moving GPUs on-die, which didn't help the CPU power requirements any).

ARM's products are substantially better than Intel's when it comes to power/performance ratios.

In early 2008, Intel introduced the Atom 230. It was a single-core processor running at 1.6GHz, with a TDP of 4W, and the chipset adds a whole bunch to that. In early 2010, the Atom N450 integrated the GPU, producing a total CPU/chipset power requirement of 6.5W. This is a bit lower than the early Atom processors.

Modern dual-core Cortex A9 processors from ARM will outperform either the 230 or the N450, and have a total TDP of 2W. Intel is still using more than three times the power to deliver the same performance. Heck, Intel's own ULV processors have largely gutted the market formerly dominated by Atom. They consume a lot more power, but still offer a better power/performance ratio.

Intel just has not proven that they can produce a sufficiently power efficient processor to compete with ARM. It's not unreasonable to assume that they'll eventually get there, but early claims about dominating the market are a bit premature.
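The power gap described above can be put into one number, using only the TDP figures cited in the comment (TDP being a rough proxy for actual draw, not a measured load figure):

```python
# TDP figures as stated in the comment above; TDP is only a rough proxy for real draw.
atom_n450_w = 6.5   # Atom N450, CPU plus integrated chipset/GPU
cortex_a9_w = 2.0   # dual-core Cortex-A9 SoC

ratio = atom_n450_w / cortex_a9_w
print(ratio)  # 3.25 -> "more than three times the power" for similar performance
```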




By cartera on 2/15/2011 2:20:02 PM , Rating: 2
quote:
Never mind the fact that x86, a CISC processor, inherently has more registers

Having lots of registers is a defining feature of RISC-type processors and is generally considered a good thing.




Great image choice.
By Aenslead on 2/15/2011 4:20:06 PM , Rating: 2
I must say that this site usually chooses the best images for their news. VERY ad-hoc.




Remember Larrabee.... ?
By kilkennycat on 2/16/2011 10:51:12 PM , Rating: 2
And all the promises by Intel that never materialized....
Don't hold your breath for an x86 phone-savior.

If Intel was really smart and not ossified by their Not-Invented-Here attitudes, they would have already jumped into the ARM camp for cell-phone/tablet SoCs, where their world-leading process technology and volume-production prowess would have given them an automatic edge. It seems that Intel is quite determined to go down with the X86anic. Unlike its famous predecessor, this particular ship has developed a slow leak and the captain is apparently oblivious to the problem... a bit like a frog in water very slowly heated to boiling.




CEO wants to keep his job
By wordsworm on 2/18/2011 7:51:38 AM , Rating: 2
Maybe the CEO of Intel is looking at AMD right now and thinking that he'd better make some big boasts about Intel and the mobile market before his next board meeting... If Meyer had thought of that before it was too late... :P




By vision33r on 2/18/2011 10:16:46 AM , Rating: 2
While Apple is also on ARM with the Cortex A9, it is highly possible that Intel could approach Apple again and give them huge discounts and incentives to convert over to x86.




Copyright 2014 DailyTech LLC.