Surprise! Microsoft's Internet Explorer 10 preview was gracefully running on an ARM CPU, unbeknownst to the audience. Microsoft employees let this little secret out later at the conference.  (Source: Engadget)
Watch out, Intel and AMD: power-efficient ARM processors will soon be able to run Windows

At CES 2011, Microsoft Corp. (MSFT) CEO Steve Ballmer showed off an early build of a next generation Windows operating system running on an ARM architecture CPU.  This week at Microsoft's MIX Developer Conference in Las Vegas, the company gave developers a surprise Easter egg -- a preview build of Internet Explorer 10 and its underlying version of Windows were running on a 1 GHz ARM processor.

Samsung Electronics (005930), Texas Instruments Inc. (TXN), Qualcomm Inc. (QCOM), NVIDIA Corp. (NVDA), and other ARM chipmakers have all been hard at work cooking up power savvy multicore offerings, which would be perfect for a netbook or notebook.  

Versus similarly clocked x86 processors from Intel or AMD, ARM processors would likely squeeze out an hour or two of extra battery life.  While die shrinks and ever-rising leakage current may eventually erode much of that advantage, in the short term ARM presents the first compelling consumer alternative to x86 in decades.

Windows 8 is expected to insert Microsoft's Ribbon UI element into more locations, including Windows Explorer.  It is also expected to have deeper touch integration and tie together the PC version of Windows with the Metro UI that Microsoft developed for the defunct Zune and Windows Phone 7.

But the addition of ARM support is perhaps the most anticipated feature.

While ARM currently offers power advantages, how compelling a buy ARM-based Windows portables will be is still an open question.  By offering base Windows support, including access to its Office suite and other enterprise tools, Microsoft makes ARM accessible to the everyday consumer.

But exactly how far Microsoft is able to go with its compatibility efforts remains to be seen.  If Microsoft can add ARM support for the DirectX and sound libraries, for example, it would be a relatively trivial exercise for developers to recompile their executables for ARM-architecture Windows 8 computers.
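
If the core Windows libraries come across, most well-behaved C or C++ code needs nothing more than a recompile.  As a minimal sketch (assuming MSVC defines the usual per-architecture macros such as _M_IX86 and _M_X64, plus an ARM equivalent along the lines of _M_ARM), the same source simply reports which target it was built for; nothing in it is x86-specific:

/* Minimal sketch: portable C that needs no source changes to move from
 * x86 to ARM Windows, only a recompile.  The architecture macros tested
 * here (_M_IX86, _M_X64, _M_ARM) are assumed MSVC definitions for each
 * build target; everything else is plain standard C. */
#include <stdio.h>

int main(void)
{
#if defined(_M_ARM)
    const char *arch = "ARM";
#elif defined(_M_X64)
    const char *arch = "x64";
#elif defined(_M_IX86)
    const char *arch = "x86";
#else
    const char *arch = "unknown";
#endif
    printf("Compiled for %s -- same source, different build target.\n", arch);
    return 0;
}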

Microsoft makes the world's most widely used development environment, Microsoft Visual Studio.  By adding tools that make it quick and easy to switch from x86 to ARM builds, Microsoft could make application compatibility complaints largely a moot point.  

Likewise, if Microsoft can embed an ARM-specific virtual machine in the OS with an x86 emulation layer, it might be possible to run native x86 apps, as is, without recompilation.  This would be helpful in cases where a company didn't have the source and the application developer was unresponsive or unwilling to make the change.  Implementing the same sort of system to provide ARM emulation in x86 Windows would be even more helpful to ARM, because it would allow developers to effectively target the more efficient ARM architecture, while ignoring x86.
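
To make the emulation idea concrete, here is a deliberately tiny, hypothetical sketch of what such a layer does at its core: fetch a guest instruction, decode it, and reproduce its effect on emulated registers.  The opcodes and structure below are invented for illustration only; a real x86-on-ARM layer would also need full variable-length x86 decoding, flags, memory mapping, and system-call translation:

/* Toy sketch of an emulation layer's core loop: fetch a guest instruction,
 * decode it, and mirror its effect in software.  The opcodes are invented
 * for illustration and are not real x86 encodings. */
#include <stdint.h>
#include <stdio.h>

enum { OP_LOAD_IMM = 0x01, OP_ADD = 0x02, OP_HALT = 0xFF };

typedef struct {
    uint32_t reg[4];   /* emulated general-purpose registers */
    size_t   pc;       /* emulated program counter */
} GuestCpu;

static void run(GuestCpu *cpu, const uint8_t *code, size_t len)
{
    while (cpu->pc < len) {
        uint8_t op = code[cpu->pc++];               /* fetch */
        switch (op) {                               /* decode + execute */
        case OP_LOAD_IMM: {                         /* reg[r] = imm8 */
            uint8_t r = code[cpu->pc++];
            cpu->reg[r] = code[cpu->pc++];
            break;
        }
        case OP_ADD: {                              /* reg[dst] += reg[src] */
            uint8_t dst = code[cpu->pc++];
            uint8_t src = code[cpu->pc++];
            cpu->reg[dst] += cpu->reg[src];
            break;
        }
        case OP_HALT:
            return;
        default:
            fprintf(stderr, "unknown opcode 0x%02X\n", op);
            return;
        }
    }
}

int main(void)
{
    /* "Guest program": r0 = 2; r1 = 3; r0 += r1; halt */
    const uint8_t program[] = {
        OP_LOAD_IMM, 0, 2,
        OP_LOAD_IMM, 1, 3,
        OP_ADD, 0, 1,
        OP_HALT
    };
    GuestCpu cpu = {0};
    run(&cpu, program, sizeof(program));
    printf("r0 = %u\n", (unsigned)cpu.reg[0]);   /* prints 5 */
    return 0;
}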

Ultimately the question also still remains how low Intel can price its options and how big the true gap in power efficiency will be.  Unlike in the past, Intel may now find its pricing ability hindered by new international scrutiny that prevents it from resorting to anti-competitive arrangements to try to stomp out pesky rivals like ARM. But the exact picture is unclear.

Even more unclear is the fate of Microsoft tablets.  Even if ARM takes off in the notebook space, it may do little to help Microsoft sell Windows tablets, with Apple and Android so deeply entrenched.  In that regard, Microsoft may find that it's just given ARM a free ride to major expansion.  If that's the case Microsoft's customers should still reap minor gains -- a positive for the company -- but Microsoft itself may not make significant in-roads in its market expansion hopes.


Comments

RE: Power vs clock
By vol7ron on 4/13/2011 5:02:58 PM , Rating: 2
Nope, I agree with him. x86 is inferior to RISC for the sheer fact that RISC is a smaller instruction set and x86 includes support for legacy systems (not always a bad thing, but really, really old systems).

With current silicon, the procs will hit a die-shrink wall and ARM will catch up. This is a fact, but there are other feasible alternatives to silicon, so that wall may still be some time off.

I'm curious how the Linux community will handle this. I'm not sure whether there's Red Hat or Ubuntu support for ARM.


RE: Power vs clock
By omnicronx on 4/13/2011 5:32:54 PM , Rating: 5
That's clearly a biased, one-sided way to look at things. Both RISC and CISC have their advantages and disadvantages, but to say one is inferior to the other with little to no analysis beyond the instruction set is flat-out incorrect.

The Power series in particular are not consumer desktop parts; they are clearly enterprise parts meant for high-end servers (these things are huge; I'm not even sure you could fit one in most desktop cases). So why the OP would be comparing an i7 to an 8-core POWER7, I'm not sure.

Clearly not apples to apples. If you want a fair comparison, it would be the Nehalem-EX or Itanium line vs. the POWER7 (and once again, both have their advantages and disadvantages; the POWER7 may have the best raw performance, but both Intel lines are more flexible in terms of scalability and also cost a LOT less).


RE: Power vs clock
By EclipsedAurora on 4/14/2011 1:10:40 AM , Rating: 2
quote:
The Power series in particular are not consumer desktop parts; they are clearly enterprise parts meant for high-end servers (these things are huge; I'm not even sure you could fit one in most desktop cases). So why the OP would be comparing an i7 to an 8-core POWER7, I'm not sure.


Not really. All three consoles (the PS3, 360, and Wii) are powered by IBM POWER-family CPUs. POWER and Xeon compete in the server field as well; the IBM z11 mainframe accepts either Xeon or POWER7 as an accelerator module. However, it's obvious that Xeon completely loses out to POWER in both performance and RAS! Xeon only has the price advantage!

quote:
That's clearly a biased, one-sided way to look at things. Both RISC and CISC have their advantages and disadvantages, but to say one is inferior to the other with little to no analysis beyond the instruction set is flat-out incorrect.

I'm afraid to tell you that ever since the original Pentium, Intel x86 CPUs have had a RISC core inside as well. All they did was translate the CISC input into RISC operations for processing and then back to CISC on the way out. That's part of the reason so much transistor count and power consumption is wasted!


RE: Power vs clock
By SlyNine on 4/14/2011 10:12:28 AM , Rating: 3
Again, you are not talking about the same CPUs. All the CPUs in the consoles would get their asses handed to them by a Core i7, i5, or even i3. So try again.

You are overemphasizing the impact of backwards compatibility and the number of transistors it requires.


RE: Power vs clock
By Zingam on 4/23/2011 4:23:56 AM , Rating: 2
i3 is at least 5 years newer than any of the console chips!


RE: Power vs clock
By SlyNine on 9/15/2011 8:20:07 AM , Rating: 2
The i7 came out in 2008; the PS3 came out in '06. The i3 is based on pretty much the same tech and is a BUDGET CPU, while the Cell was supposed to be a super high-end CPU.

My old Core i7 (socket 1366) is still high-end enough to beat the crap out of a brand-new 2011 Core i3.


RE: Power vs clock
By vignyan on 4/13/2011 6:51:00 PM , Rating: 1
Whoa... I can tell that you don't really understand the complications in computer architecture... You think you do, but you don't... Trust me, I am a geek! :)


RE: Power vs clock
By Azethoth on 4/14/2011 12:59:40 AM , Rating: 3
No. You read an article in the early '90s about how RISC (Reduced Instruction Set Computing) was going to dominate over CISC (Complex Instruction Set Computing). Then, when that did not actually happen, you did not do your homework as to why.

The answer is that the external instruction set you send to a CPU only occupies a tiny part of the die, the part used to decode it into the actual internal operations the CPU executes. After that step there really is not much separating the two architectures and no intrinsic reason for separation; they can both make use of the same tricks, the latest innovations, the same design choices, etc.

So instead of Intel dying, they beat RISC at its own game and relegated the "legacy" aspects of their design to a tiny tax that pales in comparison to the relentless tick-tock of Moore's Law at Intel.

Why they used their monopoly power to kick AMD in the face is a mystery to me. They compete quite well without such shenanigans.


RE: Power vs clock
By Strunf on 4/14/2011 7:58:39 AM , Rating: 2
Exactly; the latest CISC CPUs are only CISC on the surface. Once the decode is done, it's just like any other RISC.

The difference I see is that on a CISC CPU the instructions are broken into smaller ones at the CPU level, whereas with RISC they are broken into smaller ones at the software level, by the compiler.


RE: Power vs clock
By Kary on 4/14/2011 5:36:12 PM , Rating: 2
Actually, I am one of the ones who read the books in the '90s (I was studying electrical engineering, specializing in computer systems).
As I recall, they said that RISC chips were easier to design and smaller, but produced more traffic on the memory bus and required more RAM for the same program. A RISC instruction typically does something like this:
Load Register A with RAM location x89030084
CISC chips were more complex to build and tended to be larger since they had to support more instructions, but a single instruction could do something like this:
Load from memory location x03895783, add Register A, multiply by Register B, then save in Register C. Note that those don't have to be separate instructions: all of that could be done by ONE instruction, with one read for the instruction and one read of the data location being all the RAM access needed, versus one (or more) RAM reads per step for RISC.
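
A rough way to picture that trade-off in code (the instruction sequences in the comments are schematic only, not real x86 or ARM mnemonics):

/* Schematic comparison: the same C statement as a hypothetical CISC
 * machine might encode it versus the load/store sequence a RISC machine
 * would need.  The "instructions" in the comments are illustrative only. */
#include <stdint.h>

uint32_t scale(const uint32_t *p, uint32_t a, uint32_t b)
{
    /* One C statement: read memory, add, multiply. */
    uint32_t c = (*p + a) * b;

    /* CISC-style (memory operands allowed, so fewer, denser instructions):
     *   MUL-ADD  c, [p], a, b      ; one instruction reads memory,
     *                              ; adds a, multiplies by b
     *
     * RISC-style (load/store architecture, fixed-size instructions):
     *   LOAD  t, [p]               ; bring the value into a register first
     *   ADD   t, t, a
     *   MUL   c, t, b
     *
     * Fewer instruction fetches for the CISC form, but each one is harder
     * to decode; the RISC form needs more instruction-stream bandwidth. */
    return c;
}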

And yes, Intel switched to RISC internally (the term "micro-ops" is preferred here, I believe; it's a CPU within the CPU, so to speak), so they basically chose the best of both worlds.

...any chance of a complete article along these lines? Seems like a subject that comes up often and is gaining more attention.
