

Working Moorestown MID Prototype  (Source: Intel)
Moorestown MID promises ten times the battery life of Atom-powered MIDs

Intel is looking to the future of the Mobile Internet Device (MID) and sees a large market for such devices. The company unveiled the first working Moorestown platform at the Intel Developer Forum (IDF) in Taipei, showing the prototype during the keynote delivered by Anand Chandrasekher.

During the speech, Chandrasekher talked about the impact that the internet and mobile web has had on the lives of consumers across the world. Chandrasekher said, "Technology innovation is the catalyst for new user experiences, industry collaborations and business models that together will shape the next 40 years. As the next billion people connect to and experience the Internet, significant opportunities lie in the power of technology and the development of purpose-built devices that deliver more targeted computing needs and experiences."

Chandrasekher pointed to Intel's Atom processor, the upcoming Nehalem processor, and the Moorestown platform as examples of Intel's leadership in products that help deliver internet access to consumers.

The working Moorestown device includes a SOC codenamed Lincroft that integrates a 45nm processor, graphics, memory controller, and video encode/decode onto a single chip. The platform also includes an I/O hub codenamed Langwell that provides a variety of I/O ports for connecting wireless, storage, and display devices.

Moorestown is on track, according to Chandrasekher, to reduce MID power consumption by 10x compared to current Atom-powered devices. Moorestown platforms will support a range of wireless technologies including 3G, WiMAX, Wi-Fi, Bluetooth, and mobile TV. Intel and Ericsson are working together on an HSPA module optimized for Moorestown; the small modules will measure 25x30x2 mm and have low power requirements.

Minor details of Intel's upcoming Core i7 processors were also revealed at IDF by Intel's Kirk Skaugen. The i7 processors are set to launch next month, and Skaugen said the parts will provide outstanding performance for gaming and content creation.

Intel first introduced the Moorestown platform in May 2007. The big change from previous-generation platforms for the same class of MID devices is that Moorestown integrates the CPU, GPU, and memory controller into one chip; previous mobile platforms required separate chips for each function.

By combining these functions into one chip, Intel was able to save space and power while delivering improved performance. At the time of introduction, Intel roadmaps showed Moorestown devices delivering approximately 24 hours of run time under mixed productivity use and web surfing.

Intel didn't offer updated availability information at IDF 2008; the delivery target given at Moorestown's introduction was mid-2009. With low-cost netbooks propping up the PC industry during the current global economic downturn, Moorestown devices appear to be in a good position in the market.

If the Moorestown MIDs come to market at roughly the same price as the Atom-powered netbooks with much greater battery life, technology fans looking for a low-cost option for browsing the internet and checking email could find the Moorestown-powered MIDs ideal for their needs.



Comments

Someone's pissed...
By quiksilvr on 10/20/2008 12:11:45 PM , Rating: 2
I have the feeling that Intel isn't too happy that its integrated graphics aren't being used in the new Macs, so they're boasting about this new platform. And who knows? Maybe their integrated graphics chips will implement this technology along with Atom to compete in the new market.




RE: Someone's pissed...
By SunAngel on 10/20/08, Rating: 0
RE: Someone's pissed...
By helios220 on 10/20/2008 12:37:49 PM , Rating: 2
According to Apple PR, their rationale for abandoning Intel integrated graphics in favor of the nVidia integrated offering (9400M) was an up to 6x performance increase over the previous Intel chip. I believe they mentioned size and power consumption gains as well, but the nail in the coffin was definitely the graphics performance. Intel even put out a press release almost admitting they got womped in this round, vowing future revenge of course.


RE: Someone's pissed...
By psychobriggsy on 10/20/2008 12:37:52 PM , Rating: 2
Why would Apple choose NVIDIA over Intel?

1) GPU performance is significantly higher - also useful for CUDA, including Photoshop CS4.

2) Single chip chipset saves motherboard space.

3) Potentially cheaper.

4) Intel graphics drivers are notorious for being behind the advertised feature-set for a long time after release.

I don't think that it has lower power consumption.


RE: Someone's pissed...
By Chris Peredun on 10/20/2008 12:49:03 PM , Rating: 2
quote:
I don't think that it has lower power consumption.


It doesn't - the 9400M draws about 3W more than the GM45 when you compare the chips at full load.

Mind you, for that 3W, you get a massive increase in computing power.


RE: Someone's pissed...
By SunAngel on 10/20/2008 12:58:08 PM , Rating: 2
thanks chris for reminding me we were talking mobile instead of desktop. i will gladly go with 3w additional power if it means a significant improvement in 3d performance (even though i don't game and rarely edit photos). hopefully, the pricing on the nvidia mobile chips don't add significantly to the price of the notebooks or netbooks.


RE: Someone's pissed...
By bhieb on 10/20/2008 2:15:39 PM , Rating: 5
quote:
hopefully, the pricing on the nvidia mobile chips don't add significantly to the price of the notebooks or netbooks

Probably not, but the Apple sticker will.


RE: Someone's pissed...
By Samus on 10/20/2008 11:40:11 PM , Rating: 3
nvidia is obviously giving apple a huge price cut compared to intel. remember, this is amidst nvidia recalling a number of GPU's, a loss of over 200 million dollars as a result, and a tarnished reputation to boot.

nvidia needs this, and apple needs performance gpu's. both sides win.


RE: Someone's pissed...
By MonkeyPaw on 10/20/2008 12:58:27 PM , Rating: 2
quote:
intel has a long history of driver improvements that come over time after a product is introduced.


Oh, as opposed to everyone else in the industry? Of all IGP makers, Intel has got to be the worst at driver support. Intel is great at making CPUs, but their IGPs are being pummeled by ATI and nVidia.


RE: Someone's pissed...
By Oregonian2 on 10/20/2008 1:43:25 PM , Rating: 2
You think that they wouldn't boast about their new product if it weren't for Apple? Um... I've never known Intel to ho-hum a new product of theirs -- that which you seem to suggest they would have done w/o Apple's rejection of them in one of their products.


SOC's?
By ineedaname on 10/20/2008 12:17:33 PM , Rating: 2
SOC's sound like a good idea to save power for MID's but it's going to suck big time if they eventually transfer SOC's to desktops or even bigger laptops.

Just seems like a way for Intel to monopolize and control the chipset market. Maybe even wireless cards or other things will be implemented into a single chip eventually.

Don't get me wrong I'm all for SOC's on my netbook but I would hate it if 10 years down the road everything is SOC.




RE: SOC's?
By Tsuwamono on 10/20/2008 12:28:21 PM , Rating: 5
Socks are an amazing way to save power. I wear big wool ones at home so I can leave the heat lower.


RE: SOC's?
By StevoLincolnite on 10/20/2008 12:42:21 PM , Rating: 2
I agree, I would hate to have a System on a Chip in my desktop; I would rather have the GPU, WiFi, etc. stripped out and get the processor at a lower price.

It's actually amazing how far technology has come. 12-14 years ago I laughed at the idea of having a PC running Windows that I could carry around effortlessly WITH a COLOUR screen, with ample power to play StarCraft 1! Eventually, I believe, with processors going more parallel and GPUs getting more programmable, the Super CGPU (Central Graphics Processing Unit) will be the only chip in our computers.


RE: SOC's?
By foolsgambit11 on 10/22/2008 4:04:59 PM , Rating: 2
Yeah, I remember when the CPU(ALU) and the FPU were separate chips. My first computer had an empty socket for an FPU if I wished to 'upgrade'. Shouldn't we go back to that? I mean, would you rather have the FPU stripped out and get the processor at a lower price?

But more realistically, it makes some sense to integrate ALU, FPU, GPU, (possibly PPU, too), specialized decoding pathways (h.264, for example) and memory controller into a single chip - they all access main memory and perform processing calculations on data. Upgrades could take place by adding discrete graphics or physics cards. But the gains of integrating I/O onto the same chip are much lower, so I imagine that will come later, especially in non-power-sensitive applications. So wireless, storage, interface devices, etc. will probably be on a second chip - ICH or South Bridge, or whatever you want to call it - on the desktop and mid- and high-end laptops for a while to come.

But your comment that you'd like to have the features stripped out assumes that the additional features would come with additional costs. That's not forward-looking enough. There will certainly come a time when all of the features we drool about now will become so cheap (nearly free) to implement that it makes sense to include them rather than not. We're still a few years from that point, but I think you may change your mind about SoCs at some point, assuming you live long enough to see technology advance to the point where putting WiFi capability on a chip, for example, costs pennies.


RE: SOC's?
By bobsmith1492 on 10/20/2008 12:53:23 PM , Rating: 2
SOCs are great for portable devices.

They might appear in desktop devices (making them tiny and low-power at the same time, meaning possibly a true desktop - small box on your desk - or built into your monitor.)

For high-performance machines parts will stay separate if for no other reason than heat. Imagine a high-end GPU and CPU in the same chip...

Also there would be so many interconnects to video ram, system ram, etc. that board layout and routing would be expensive (many-layer MBs).


RE: SOC's?
By mattclary on 10/20/2008 12:55:00 PM , Rating: 2
There will always be a need for high-power computing on the desktop. Either a SOC will be available that can crunch huge numbers, or you will have discrete components that can do it. If there is a demand, they will build it.


RE: SOC's?
By Mr Perfect on 10/20/2008 1:00:18 PM , Rating: 2
Well, we're getting them anyhow... to a degree. Athlon64 and now i7 have integrated memory controllers, and next we'll have Fusion-like CPUs with built-in graphics too.

As long as desktop boards continue to offer expansion slots, we should actually come out ahead. Take a look at the OEM boxes and laptops out there, most of them are newish CPUs chained to older chipsets. If chipsets get integrated into the CPU, then maybe our PC suppliers won't be able to screw us over like they do by using 945 motherboards to run our Core 2s.


Missing the point
By toyotabedzrock on 10/21/2008 3:15:37 AM , Rating: 2
I think whoever wrote this missed the point. This is not a chip meant for a notebook or netbook; it is aimed at eventual inclusion in smartphones. MIDs are just another stepping stone, like Atom, since they are usually a little bigger and have larger batteries than a phone.




RE: Missing the point
By Diesel Donkey on 10/21/2008 10:30:57 AM , Rating: 2
quote:
If the Moorestown MIDs come to market at roughly the same price as the Atom-powered netbooks with much greater battery life


I think Shane was right on target. He made a clear distinction between netbooks and MIDs.


(What Intel is really hoping for : ) News !
By Oralen on 10/21/2008 5:30:02 PM , Rating: 2
First games on a MID are delayed until 2018... Until then, only Solitaire and Sudoku will be available.

Because if people start caring about graphics performance, even on those tiny machines... they're so screwed...

I mean, I am as impatient to see Larrabee as the next man, but do you really think they can become the best in graphics, in every market, overnight?

SOC is a great idea... Until you realise that even if one part of that chip sucks, the whole system will.

Let's imagine :

Great Processor ? Check.

Good Memory controller ? Check.

Performance I/O ? Check.

Graphics ?

"Eeeemmm... Boss ? Nvidia is on the phone, but we can't understand them 'cause they can't stop laughing, Tom's team hasn't finished the alpha driver yet, QVGA videos can only play at 10 frame per seconds, but we are sure we can release a patch in... Fifteen months ?"

(Sound of sobbing in the background. Slow fade to black.)




By FishTankX on 10/21/2008 9:06:17 PM , Rating: 2
I'm willing to bet that the DS and PSP have better battery life and lower power consumption than moorestown. Graphics performance can go a long way as long as it's specifically optimized for a platform.

I mean, there are cellphone graphics chips that are more powerful than the DS or PSP.

Now, whether or not the intel graphics chip will suck so badly that Doom will be a risky proposition remains to be seen. But at the very least, software rendering should provide something.


10 times what?
By 3p on 10/20/2008 5:13:48 PM , Rating: 2
With the screen/WiFi/etc. burning a few watts even if the CPU/chipset consumes nothing, they won't be able to get 10 times the running time. (5W (Atom+Poulsbo) + 3W (rest of the system) vs. 0W (Moorestown) + 3W (the rest) => about 3 times the running time at best.)
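
A minimal back-of-envelope sketch of this arithmetic in Python, using the comment's illustrative figures (roughly 5 W for the Atom plus Poulsbo platform, about 3 W for the screen, radios, and the rest of the system) and assuming Moorestown actually hits a 10x platform-power reduction; the battery capacity is a hypothetical value chosen only for illustration:

def run_time_hours(battery_wh, platform_w, rest_of_system_w):
    # Run time in hours = battery energy (Wh) / total average power draw (W)
    return battery_wh / (platform_w + rest_of_system_w)

battery_wh = 20.0  # hypothetical MID battery capacity in watt-hours

atom_hours = run_time_hours(battery_wh, platform_w=5.0, rest_of_system_w=3.0)
moorestown_hours = run_time_hours(battery_wh, platform_w=0.5, rest_of_system_w=3.0)

print(f"Atom-based MID:       {atom_hours:.1f} h")   # ~2.5 h
print(f"Moorestown-based MID: {moorestown_hours:.1f} h")   # ~5.7 h
print(f"Improvement:          {moorestown_hours / atom_hours:.1f}x")   # ~2.3x

Even under these generous assumptions, whole-device run time improves by roughly 2-3x rather than 10x, because the fixed screen and radio overhead dominates once the platform power drops.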




Uh. Good luck AMD?
By OCedHrt on 10/20/2008 7:18:56 PM , Rating: 2
"The working Moorestown device includes a SOC codenamed Lincroft that integrates a 45nm processor, graphics, memory controller, and video encode/decode onto a single chip."

Sounds like Fusion? Albeit lower power.




"The whole principle [of censorship] is wrong. It's like demanding that grown men live on skim milk because the baby can't have steak." -- Robert Heinlein

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki