
Expected to launch 35 tablets by year's end

Intel Corp. is preparing to unveil nearly a dozen new tablet computers running on its chips at Computex at the end of May, the Wall Street Journal reports.

Intel is seeking to expand beyond PCs and into the mobile market, where ARM has dominated thanks to licensees such as Qualcomm, NVIDIA, and Texas Instruments. 

Intel recently announced its new 22nm 3D Tri-Gate transistors that will boost performance by up to 37 percent compared to existing 32nm technology. It's all part of the company's focus on increasing performance while lowering power consumption -- a move aimed directly at ARM and its hold on the smartphone and tablet market.

Intel is launching a new set of Atom chips, codenamed Oak Trail, specifically for tablets. "While the project improves Intel's position, analysts say the company faces an uphill struggle, as it comes late to the game and is also handicapped by its lack of strong partnerships and applications designed for Android or other popular tablet operating systems, unlike its position in the PC world with Microsoft Inc.'s Windows," WSJ reports.

But Navin Shenoy, Intel's general manager for Asia-Pacific, told WSJ that more than 35 Intel-chip-based tablets are targeted to ship by the end of the year. He also mentioned that component shortages from Japan did not affect Intel's supply chain. 



Comments



Trigate
By Paj on 5/18/2011 8:12:05 AM , Rating: 3
If Intel implements Trigate well, it could be a game changer. If this performs poorly though, they'll probably have a hard time convincing manufacturers with strong ties to ARM to switch to their chips.




RE: Trigate
By paydirt on 5/18/2011 9:01:33 AM , Rating: 3
Yeah, +37% performance at load OR 50% less power draw at load, PLUS 90% less power draw during inactivity.

If Intel really can get to a 14nm manufacturing process, I think it will be hard for their competitors to follow suit unless they are getting the 22nm and 14nm technology through licenses from another company...?

Apple will still bring the sexy with its designs and applications, but if it is using technology that is 30%+ slower, that will be a hard sell (though Apple is certainly up to that task).
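
A back-of-envelope sketch of what those load/idle numbers could mean for average draw (a minimal sketch; the 10% active / 90% idle duty cycle and the 1 W / 0.2 W baselines are illustrative assumptions, not Intel figures):

    // Hypothetical duty-cycle math for the claimed 22nm gains.
    // The baselines and the 10/90 split are made-up inputs for illustration.
    public class TriGateEnergy {
        public static void main(String[] args) {
            double activeFrac = 0.10, idleFrac = 0.90;
            double baseActiveW = 1.0, baseIdleW = 0.2;   // assumed 32nm chip
            double newActiveW = baseActiveW * 0.5;       // claimed 50% less at load
            double newIdleW = baseIdleW * 0.1;           // claimed 90% less at idle
            double baseAvg = activeFrac * baseActiveW + idleFrac * baseIdleW;
            double newAvg = activeFrac * newActiveW + idleFrac * newIdleW;
            System.out.printf("average draw: %.3f W -> %.3f W (%.0f%% lower)%n",
                    baseAvg, newAvg, 100 * (1 - newAvg / baseAvg));
        }
    }

With those inputs the average draw falls from 0.28 W to 0.068 W, roughly 76% lower, which is why the idle figure matters more than the load figure for battery life.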


RE: Trigate
By paydirt on 5/18/2011 9:05:48 AM , Rating: 2
The REALLY interesting thing about the Intel news yesterday was that Paul Otellini (CEO of Intel) asked, "What company in the mobile space is the most profitable?" "Intel," he answered. For every 600 smartphones or 122 tablets out there, the telecom company needs to buy a _server_. AND they win on RIM, Nokia, Motorola, etc. devices, not just Apple devices.

3D fabrication and the leap in performance will allow Intel to pull ahead of AMD in the server space and prevent or stall an entry into that space by ARM.
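
A quick sketch of the server-demand arithmetic behind that claim (the fleet sizes are made-up inputs; only the 1-per-600 and 1-per-122 ratios come from the comment above):

    // Implied back-end server demand from the quoted ratios.
    public class ServerDemand {
        public static void main(String[] args) {
            long smartphones = 300_000_000L;  // hypothetical fleet size
            long tablets = 50_000_000L;       // hypothetical fleet size
            long servers = smartphones / 600 + tablets / 122;
            System.out.println("implied server demand: " + servers);
        }
    }

For those inputs the ratios imply roughly 910,000 servers, which is the sense in which Intel profits from every vendor's devices.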


RE: Trigate
By Mitch101 on 5/18/2011 11:02:42 AM , Rating: 2
FinFET is basically 3D transistors and AMD's hope to compete, but as usual Intel is 1-2 steps ahead of everyone else in manufacturing. I'm not sure where GlobalFoundries is with this, but the nice part for AMD is that they can outsource to the first company that can provide it.

TSMC pushes FinFET tech back to 14nm process size
http://www.thinq.co.uk/2011/5/16/tsmc-pushes-finfe...


RE: Trigate
By omnicronx on 5/18/2011 1:11:32 PM , Rating: 2
When ARM chips reach dual-core 2GHz, do you really think the extra power an x86 chip would provide will be necessary for the vast majority of applications?

Apple has been perfectly content in the past staying back on technology and waiting for it to mature, so being on the forefront of technology does not seem to be a requisite for Apple's success.

I have my doubts Apple will switch anytime soon; they are becoming increasingly involved in the actual design of ARM SoCs, and it makes little sense to go from a custom part over which you have full control to an Intel-controlled part.

ARM just makes more sense for a vertically integrated company like Apple, in both the long and short term.


RE: Trigate
By Reclaimer77 on 5/18/2011 9:36:05 AM , Rating: 2
Trigate is the real deal. I fear for AMD. They will either have to pay Intel licensing rights to use the technology, or, well, I don't know. But the 3D transistor is a game changer; no CPU without it will be able to compete.


RE: Trigate
By DanNeely on 5/18/2011 10:30:00 AM , Rating: 3
Intel's the first to use it in a production environment, but the concepts behind 3D transistors are several decades old. Unless Intel discovered and patented something vital to scaling them from the lab bench to the fab, there aren't any licensing rights involved.

The only question is how far out 3D transistors are on GF's roadmap. TSMC's roadmap has them in the 2014(?) timeframe, which will give Intel a few years of added process lead in the ARM war. I haven't seen anything about when other fab companies plan to have their implementations in place.


RE: Trigate
By 91TTZ on 5/18/2011 11:01:22 AM , Rating: 2
There is no reason to fear for AMD. Intel won't let AMD die, since AMD is currently the shield between Intel and antitrust lawsuits.

As far as AMD goes, Intel is very happy with the current setup they have. The competition is under control and they don't want it any other way. However, the mobile space is still up for grabs so Intel is looking for a way to get control of that situation.


RE: Trigate
By Mitch101 on 5/18/2011 11:09:21 AM , Rating: 1
If more applications finally leverage the GPU, then AMD stands to get some significant boosts. Intel already has its GPU on some chips, and with AMD building a GPU into its CPUs, developers should finally start leveraging what's available to them for speed improvements.

I'm looking at you, compression, anti-virus, and graphics (Adobe mainly).

Some apps, like Photoshop and some web browsers, already take advantage of GPUs; once the GPU is standard, more developers need to leverage whichever is better (or both) for their product.


RE: Trigate
By phatboye on 5/18/2011 11:05:38 AM , Rating: 4
http://www.anandtech.com/show/4345/intels-2011-inv...

According to this article, Intel isn't bringing tri-gate technology to the Atom arch till sometime in 2013. Most ARM SoCs will follow suit with tri-gate a few months later, so I doubt this will be much of a game changer, because ARM SoCs will have the upper hand. You probably won't see much game changing till 2015, when Intel releases its Atom arch on the 14nm process.


RE: Trigate
By omnicronx on 5/18/2011 1:29:24 PM , Rating: 2
Baseless assumptions.

We already know ARM's roadmap through 2013, and I'll give you one guess as to what is not included.

Tri-gate makes little sense above 22nm, too, and with Intel on average at least 18 months ahead of any other foundry, that's at LEAST a year and a half after Intel has it implemented.

This of course completely overlooks the fact that this has been in the making for 10 years and won't actually be implemented for almost 13, let alone the fact that Intel most likely holds many of the patents for the implementation in question.

In other words, you won't be seeing multi-gate on ARM chips anytime soon, and it certainly won't be anywhere close to Intel's release. My guess: at least a 2-year advantage from its time of release, if not more (and I think that's being a little optimistic).


RE: Trigate
By phatboye on 5/18/2011 3:48:32 PM , Rating: 2
omnicronx, I don't really understand what you are trying to say, but I think you are confused on a few things. First, if you read the article that I linked to, you will see that Intel is releasing its 32nm Atom "Medfield" chip this year, at around the same time TSMC will start producing 28nm ARM SoCs for third parties. So the 18-month lead that you are talking about does not apply to the Atom architecture. In fact, Intel will be behind a lot of ARM SoCs in that aspect for a while.

quote:
Let alone the fact that Intel most likely holds many of the patents for the implementation in question.
The fact is that TSMC is currently developing this and has already shown prototypes, so whether Intel owns patents for it means little; I'm sure every other manufacturing company has patents as well.

quote:
In other words, you won't be seeing multi gate on ARM chips anytime soon
...and according to the article, you won't see it on Atom chips till sometime in 2013, which is still a while away.

quote:
and it certainly won't be anywhere close to Intel's release. My guess, at least a 2 year advantage if not more from its time of release. (and I think that's being a little bit optimistic)
Please leave speculation out of this. If you can't link to solid proof, then don't guess.


RE: Trigate
By omnicronx on 5/18/2011 4:15:25 PM , Rating: 2
The 18 months is based on manufacturing process (I understand that the Atom is just hitting 32nm now); i.e., Intel's 22nm 1270 manufacturing process will be in production 12-18 months before TSMC's 22nm process.

Intel's 22nm with tri-gate hits fabs in 2011; TSMC's 22nm, without tri-gate or any type of FinFET tech, second half of 2012 at the earliest.

TSMC has already stated it won't be pursuing any FinFET design until 14nm, as stated here (which is not even on the roadmap yet, so I don't know what you are talking about on the TSMC front; you are vastly incorrect):

http://www.eetimes.com/electronics-news/4215986/De...

14nm tri-gate for Intel by 2013; no official timetable for TSMC. Everything I have seen pegs it at around 2015:

"Transistor device technology has had three major innovations since 2000; TSMC started copper/low-k volume production in 2002 and high-k/metal gate production in 2011, and will start FinFET production in 2015."

http://7marketspot.com/archives/3203

And yes, clearly my speculation is just that, speculation. I was never trying to pass it off as fact, and I clearly stated that it was my opinion.


RE: Trigate
By phatboye on 5/18/2011 8:03:34 PM , Rating: 2
quote:
The 18 months is based on manufacturing process (i understand that the Atom is just hitting 32nm now), i.e Intel's 22nm 1270 manufacturing process will be in production 12-18 months before TSMC's 22nm process. Intel 22nm with trigate hits fabs 2011 , TMSC 22nm without trigate or anyt type of Finfets tech, second half 2012 at the earliest.

What part of "the 22nm tri-gate process will not hit the Atom arch until 2013" don't you understand? Yes, Intel will have its 22nm process out this year, but it will not come to the Atom arch till 2013. The 22nm process is reserved for Ivy Bridge CPUs and other products, not Atom, until later.

With that being said, TSMC will have high-performance 28nm out this year. Thus ARM SoCs will have an advantage, because Intel's Atom will still be on 32nm until 2013.


RE: Trigate
By omnicronx on 5/19/2011 1:37:38 PM , Rating: 2
What part of "2015 before TSMC goes to 14nm and tri-gate-like technology" do you not understand!

You are also clearly missing the point: while the Atom may lag behind in manufacturing process, Intel will have 2 years to ramp up production using this new technology for its OTHER LINES.

That's a good 4 years before TSMC even attempts it.

Are you truly trying to imply Intel won't have a major advantage? Or do you think TSMC just flips the switch and all fabs are suddenly ready for the new technology overnight?

Come 2013, Intel will be ready to just start making chips, as all its fabs will by then be ready for said technology.


RE: Trigate
By phatboye on 5/20/2011 8:24:32 PM , Rating: 2
quote:
What part do you not understand about 2015 before TSMC goes to 14nm and Trigate like technology do you not understand!


Please link me to where you learned TSMC won't release FinFET till the 14nm node. From what I've read, they only said that the 28nm and 22nm nodes will be planar, but you seem to forget TSMC will have an 18nm node; there has been no word on whether that technology will be ready for that node, or even when the 18nm node will be ready. They may even have it ready around the time Intel moves to 22nm. Neither one of us knows the answer to that.

quote:
You also clearly are missing the point, while the ATOM may lag behind in manufacturing process, Intel will have 2 years to ramp up production using this new technology for its OTHER LINES. Thats a good 4 years before TSMC even attempts it. Are you truly trying to imply Intel won't have a major advantage? Or do you think TSMC just flips the switch and all fabs are suddenly ready for the new technology overnight?


Yes, ramp-up takes a while, but what you are implying is that it will put them another 2 years behind Intel, for a total of 4 years behind? That is just crazy talk. Yeah, TSMC had problems with the 45nm node and it took them a really long time to ramp up production, but that is the past.

Also, even if it takes 2 years to ramp up production and they get low yields at first, that doesn't mean TSMC can't pump out chips.


RE: Trigate
By lol123 on 5/20/2011 11:15:36 AM , Rating: 2
The article you linked to and based your claim on is about the new Atom micro-architecture, Silvermont, appearing in 2013 (on 22nm). That does not mean, nor does the article say, that Atom will not be on the 22nm process until 2013, only that it won't receive a new micro-architecture until 2013. Look up the tick-tock model as you seem to be lacking some basic knowledge about how Intel plans its roadmaps these days.


RE: Trigate
By phatboye on 5/20/2011 8:07:55 PM , Rating: 2
I am quite familiar with Intel's tick-tock model, having kept up with how Intel works for years. And while the article doesn't state specifically that the Saltwell arch won't hit the 22nm node, you need to remember 2 things.

1) The Atom arch is not on the tick-tock model and never has been. That is why the first-gen Atoms were on the 45nm node for so long. If you read the article, it even states that.

2) If you look at the picture in the middle of the page entitled "Accelerating the Atom(tm) SoC Roadmap", you can see quite clearly where the node changes occur. Toward the Core arch there is a clear transition path, Nehalem->Sandy Bridge->Ivy Bridge->Future Product->Future Product, with node changes between Sandy Bridge/Ivy Bridge and between the two Future Products. However, you do not see a node change for Saltwell; you don't see a node change until 2012, and that will be for Silvermont. Now, you are right that the article doesn't explicitly say there won't be a node change for Saltwell, but from just looking at that slide I would assume there will not be one. Assuming anything else after looking at that slide is just plain silly.


RE: Trigate
By lol123 on 5/20/2011 11:40:58 PM , Rating: 2
You are right; I concentrated on the article text and forgot to look at the roadmap. If it's really accurate that Intel will not be bringing Atom to 22nm when it's available (I think it's possible that the Medfield/Saltwell SoC might be produced on 22nm late in its life without receiving a new codename, in the fashion of Sandy Bridge/Ivy Bridge), then I would consider that a departure from what they've been saying about wanting to focus their attention on mobile computing in the future.


RE: Trigate
By phatboye on 5/21/2011 8:51:58 AM , Rating: 2
Well, from the way it looks, and the way that article sounded, Intel is on track to bring Atom onto a one-year tick-tock model similar to its Core arch, so you can argue that they are focusing more on the mobile side of things. It takes a lot of time and forward planning to make a move to a schedule like that happen.


RE: Trigate
By lol123 on 5/20/2011 11:08:06 AM , Rating: 2
Where in the article does it say that?


What about instruction set differences?
By MeesterNid on 5/18/2011 9:45:51 AM , Rating: 2
It's not just as simple as stamping out a new board and gluing a new chip to it. Atom uses the x86 instruction set, whereas ARM has its own. It will be a pretty big undertaking to port an OS from ARM to x86.

This is going to be an uphill battle for Intel...not a "strong-arming" like the headline suggests.




RE: What about instruction set differences?
By Flunk on 5/18/2011 10:05:23 AM , Rating: 2
Android has already been ported to x86.


RE: What about instruction set differences?
By Taft12 on 5/18/2011 10:29:10 AM , Rating: 2
Linux was DEVELOPED on x86, what porting???


RE: What about instruction set differences?
By MeesterNid on 5/18/2011 2:21:17 PM , Rating: 3
Android isn't Linux...it has a Java VM that does need porting too.


RE: What about instruction set differences?
By omnicronx on 5/18/2011 2:57:16 PM , Rating: 2
It's not a Linux/GNU distribution, but it's clearly Linux-based, as it's built atop the Linux kernel.

Linux is a kernel; it is not a complete OS.

The Java APIs Android exposes are platform-independent; it does not matter what architecture you are running them on. These APIs in turn make calls to the internal native libraries. This implies that pretty much any Android app could work on an x86 version of Android without issue or porting.

Sure, Dalvik would need porting, but my guess is the only reason it has not been done yet is that it was specifically designed for devices with low memory footprints, which would give little advantage on most x86 systems anyway.
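
A minimal Java sketch of that portability argument (illustrative only, not real Android framework code; "mylib" is a hypothetical library name):

    // Apps compiled to Dalvik bytecode are architecture-neutral; only the VM
    // and any native (JNI) libraries are architecture-specific.
    public class PortabilitySketch {
        // Pure Java: compiles to bytecode and runs unchanged on any Dalvik/JVM
        // port, whether the underlying CPU is ARM or x86.
        static int pureJavaWork(int[] values) {
            int sum = 0;
            for (int v : values) sum += v;
            return sum;
        }

        // JNI boundary: the implementation lives in a shared library compiled
        // for one CPU architecture, so running on x86 needs an x86 build of it.
        static native int nativeWork(int[] values);

        public static void main(String[] args) {
            System.out.println(pureJavaWork(new int[] {1, 2, 3}));
            // System.loadLibrary("mylib"); // required before calling nativeWork
        }
    }

An app that stays on the pure-Java side of that line needs no porting at all; only the VM underneath it, and any NDK libraries it loads, have to be rebuilt for x86.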


RE: What about instruction set differences?
By MeesterNid on 5/18/2011 3:41:19 PM , Rating: 2
Dude, enough of your half-witted BS! Linux in the OS sense is not a freaking kernel...why don't you try booting up your machine and running with just a kernel! In the strictest sense, okay, Linux is a kernel, but it's totally useless in and of itself, and because of that nobody says Linux and means just the bare kernel. You need a pretty good stack of associated software to run "Linux".


RE: What about instruction set differences?
By omnicronx on 5/18/2011 4:40:59 PM , Rating: 3
You clearly don't understand the basic nomenclature well enough to be having this discussion.

Linux kernel + GNU = 'Linux in the OS sense'

Many embedded Linux systems do not run X Windows; are they not Linux-based either?


RE: What about instruction set differences?
By MeesterNid on 5/18/2011 6:02:31 PM , Rating: 1
Wow dude, at least be consistent! In your previous reply you were arguing that Linux is a kernel; now you're accusing me of not understanding basic nomenclature. You're confused; just stick with Windows.

And why are you bringing up X?


RE: What about instruction set differences?
By omnicronx on 5/18/2011 6:18:59 PM , Rating: 3
For the last time, Linux IS a kernel..

Take a history lesson: the GNU project predates the Linux kernel. The Linux kernel was the last step in what you know as 'Linux' today. The term has been bastardized over the years to imply that Linux equals a complete OS, when that is not the case.

http://developer.android.com/guide/basics/what-is-...

Notice how Android sits atop the Linux kernel?

And if you can't figure out why I'm bringing up X Windows and other GNU components, then this conversation ends here...


RE: What about instruction set differences?
By Samus on 5/18/2011 11:44:59 PM , Rating: 3
I think people are so fixated on current tablet trends (iOS/Android) that they're missing the big picture.

Intel tablets are obviously going to run Windows. Intel couldn't care less about Android and its apps. Current tablets are crap, just oversized phones. People who want apps can use their phones. People who want a real tablet will want something that can run real programs: Outlook with Exchange support, QuickBooks, domain membership, real remote desktop/Remote Web Workplace support, maybe even printing.

They'll go for an entirely different market and dominate it, because until Windows 8, ARM won't be able to compete in it.

This is also a clear response to Intel's disapproval of Microsoft porting Windows to a RISC architecture.


By ekv on 5/19/2011 6:03:38 AM , Rating: 2
quote:
This is also a clear response to Intel's disapproval of Microsoft porting Windows to RISC architecture.
Maybe. Often, however, MSFT will quiz Intel on what technology is in the dev pipeline, in order to gauge its research efforts. [Microsoft also does some work on the OEM hardware side of things.]

Recall Jobs picking Intel over AMD several years ago, right when AMD was clearly outperforming Intel's P4s and what-have-you. Did Jobs have inside info on what Core would do to AMD's offerings?

In other words, is it possible MSFT had an idea of what Intel would be announcing at Intel's annual meeting, yet still made the call to partner with and/or venture into ARM territory?

[Ok, kinda crazy ... unless TSMC (or GF or Chartered, etc.) plans to step up their 22nm efforts.]


By rupaniii on 5/19/2011 8:46:46 PM , Rating: 2
Uhm... the development environment seems to run great on PC. Keep arguing.


By micksh on 5/18/2011 10:38:11 AM , Rating: 2
There is a huge lag between Android on ARM and on x86. The most recent Android x86 version is 2.2. Before April this year, the latest was 1.6.

And in recent news, there will be no legacy app compatibility in the ARM version of Windows 8. Only Windows 8 on x86 will run old applications. So Intel has a strong position.


RE: What about instruction set differences?
By bug77 on 5/18/2011 10:24:14 AM , Rating: 2
It's mostly Linux. It runs on pretty much anything.


RE: What about instruction set differences?
By MeesterNid on 5/18/2011 2:16:40 PM , Rating: 2
Yeah, okay...pretty sure the Dalvik VM, which is a HUGE part of the Android stack, can't just be dropped into any random Linux OS environment and be expected to run.


RE: What about instruction set differences?
By omnicronx on 5/18/2011 3:07:34 PM , Rating: 2
You have the right idea about Dalvik and x86 in general, but please stop talking about *nix...

Linux supports many architectures, from PPC to x86 to ARM, and many of the major distributions support ARM.

I.e., if that random Linux OS environment happened to be ARM-based, your statement would not hold true. Dalvik support in a *nix environment has little to do with *nix itself, but with the requirement of an ARM-based architecture (as it currently stands).

And FYI: Dalvik (and Android in general) has already been ported to x86, just not by Google. It currently supports 2.2, and they are working on a Gingerbread release.


RE: What about instruction set differences?
By MeesterNid on 5/18/2011 3:43:59 PM , Rating: 2
No, you stop talking about Linux...look at my reply above to your ravings. And I don't need your confirmation of my "right idea" about Dalvik, as I'm pretty sure my understanding of a Java virtual machine is a good bit better than yours, judging from some of your comments.




RE: What about instruction set differences?
By omnicronx on 5/18/2011 4:43:05 PM , Rating: 2
quote:
I'm pretty sure my understanding of a Java Virtual Machine is good bit better than your's
I agree, I know little to nothing.

I'm a Java developer, but that's beside the point...


By MeesterNid on 5/18/2011 6:07:32 PM , Rating: 2
Ooooh, I'm impressed! Give me a minute to shake off the awe...I also happen to have written some Java here and there, but that alone doesn't mean I have a good understanding of the VM, since it's pretty well abstracted from the day-to-day coder. That's a separate activity in itself, the exploration.


By stimudent on 5/19/2011 11:00:42 AM , Rating: 2
Sounds like Intel is going to go to its trusty Book of Ethics Violations to strong-arm its way into the market.


MeeGo a design win??
By tlbj6142 on 5/18/2011 8:15:42 AM , Rating: 2
Is MeeGo really a design win?




RE: MeeGo a design win??
By Flunk on 5/18/2011 9:40:21 AM , Rating: 2
What's a MeeGo?


RE: MeeGo a design win??
By chagrinnin on 5/18/2011 10:15:02 AM , Rating: 5
It's Spanish for "friend". Remember that movie,..."Three Meegos",...starring Steve Martin, and uh,...two other guys?


RE: MeeGo a design win??
By bplewis24 on 5/18/2011 10:59:17 AM , Rating: 3
How dare you call Martin Short & Chevy Chase the "other guys." The Three Meegos were awesome, and coincidentally they tripled the battery life of just one Meego.

Brandon


RE: MeeGo a design win??
By Gzus666 on 5/18/2011 10:40:42 AM , Rating: 5
Caveman for "I Go".


Process technology is all well and good...
By Amiga500 on 5/18/2011 9:07:00 AM , Rating: 2
But if you cannot watch an HD movie without the sucker running at high capacity, your power consumption is still gonna be higher than an efficient design idling...




RE: Process technology is all well and good...
By shompa on 5/18/2011 12:50:32 PM , Rating: 2
Computers are getting to be "fast enough". It is only tech geeks, gamers, and heavy computer users who need the newest Intel chip. If you check the load of an ordinary PC, you will see that it idles 90%+ of the time.

PowerPC/ARM has 2 huge advantages over x86: it is cheap to build and draws little energy. Early next year ARM chips will be quad-core at up to 2.5GHz and draw about 2.5 watts. That computing capacity is enough for almost everybody: you can use Office, surf, run Photoshop, and encode video. A quad-core ARM + quad-core graphics costs OEMs about 25 dollars.

ARM has to fix 2 things: 64-bit and an interconnect so you can use many processors. Dual quad-core 2.5GHz ARMs at 5 watts.

This is the reason x86 will finally die and become a niche market, just like SPARC did 10 years ago.

The fastest computers in the world use NVIDIA/AMD graphics for computing.
The low end/middle end will be ARM.

Now if everybody started to use ARM and stopped using Flash, we could close 20+ nuclear reactors in the world just from the energy saved.
And if Windows finally disappears, we will have flying cars and other innovations.


By Oni No Kami on 5/18/2011 3:30:28 PM , Rating: 2
For ARM to be widely used in the personal computing space (PCs), it either has to develop an x86 processor or convince software developers to support its platform.

They already have a potential OS if Windows 8 will indeed support ARM.


By lol123 on 5/20/2011 11:19:15 AM , Rating: 2
That's one of the reasons why the SoC has a GPU.


Oh boo
By MrTeal on 5/18/2011 10:24:49 AM , Rating: 2
I saw the headline and thought Intel was going to bring back the StrongARM name and compete in the ARM space. Too bad; a Tri-Gate ARM on a 22nm process might give me a nice smartphone with a few days of battery life.




RE: Oh boo
By Fluppeteer on 5/18/2011 10:41:55 AM , Rating: 2
Quite. For those without the history: Digital's Alpha team made an ARM-compatible StrongARM chip tuned to the Alpha's process, which (due to this tuning) hit 200MHz when the ARM7 hit 40MHz and the ARM8 was failing to appear; this was around the time of the 133MHz Pentium. Intel acquired the team in question and produced XScale chips for a while. ARM cores in general are designed to be quite process-portable; it's very possible that Intel could produce an ARM core tuned to its process at a very high clock speed and low power.

The problem is that ARM cores tend to get used in highly integrated systems; the ARM is effectively a bolt-on bit of custom IP, just like a memory controller or TMDS transceiver. I doubt Intel is in the market for producing lots of variations to accommodate customers; its current strategy (at least in CPU space) is shoving a few similar parts out the door and getting economies of scale.


RE: Oh boo
By DanNeely on 5/18/2011 10:59:37 AM , Rating: 2
A year or two ago they signed a deal with TSMC to make Atom-based SoCs to customers' specs. When interest failed to materialize, they quietly scrapped it. There's no reason to assume they wouldn't revive it if interest were to materialize in the future.


It runs more efficient...
By tayb on 5/18/2011 9:17:25 AM , Rating: 2
Compared to currently inefficient Atom CPUs, that is. Intel still has a really long way to go before we see an Intel-branded CPU in a smartphone, and I'll withhold judgment on the tablet foray until I see the end result.

Judging by previous Intel forays into tablets, I expect horrible battery life, and if that is the case Intel will still be grasping at straws.




RE: It runs more efficient...
By DanNeely on 5/18/2011 10:36:09 AM , Rating: 2
Take a look at one of the slides AnandTech posted; although a decent chunk probably has to do with ARM being stuck at 40nm vs 32nm for Intel's Atom SoCs, they've got power usage in the very low end of the current ARM range. Most of the reason existing Atoms lagged ARM so badly was that they were all built on a performance process, not a low-power process (10% slower, 10x less idle power, load power reduced by ????). OTOH, the 45nm low-power-process Atoms Intel demoed last year were in the middle of the pack for power consumption, so Intel will probably still be somewhat competitive between when TSMC fields its 28nm process and Intel its 22nm.


"Young lady, in this house we obey the laws of thermodynamics!" -- Homer Simpson














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki