
As Will.i.am says, tech enthusiasts will be like "a kid in a candy shop" when they see all this new tech.

It's invaluable to get the chance to network with a company and witness its vision firsthand.  When that company happens to be the world's largest maker of CPUs for traditional personal computers and servers, that opportunity goes from good to great.

At the 2011 Intel Developer Forum we've attended scores of Intel Corp. (INTC) keynotes, technical "deep-dive" sessions, and other events, and we're excited by what we've seen.  Here are some of the highlights:

Ivy Bridge

Ivy Bridge is Intel's 22 nm processor, set to debut in the first half of next year.  A lot is already known about the chip.  It will be the first to use Intel's tri-gate (3D) transistor (aka "FinFET") technology, which we detailed on Tuesday.

But one interesting thing Intel emphasized in its keynote was that while Ivy Bridge is a "tick" (die shrink) from a CPU architecture perspective, the on-die GPU is a "tock".  The GPU has undergone a significant architectural redesign and appears to pack a lot more power.  In one demo Intel showed the GPU processing and displaying 20 high-definition video streams simultaneously.

For gamers, a big improvement will be the inclusion of DirectX 11 support.  Sandy Bridge's GPU topped out at DirectX 10.1, and it lacked the processing might to put features like tessellation to good use in any case.  Ivy Bridge's GPU is quite a different story.
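For the curious, the distinction is visible from code: Direct3D advertises GPU capabilities as "feature levels", and hardware tessellation requires feature level 11_0.  Below is a minimal sketch of ours (not Intel's code) using the public D3D11 API to query the highest feature level a machine's GPU grants:

```cpp
// Minimal sketch: query the highest Direct3D feature level the GPU exposes.
// Uses only the public D3D11 API; link against d3d11.lib.
#include <d3d11.h>
#include <cstdio>

int main() {
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,  // needed for hardware tessellation
        D3D_FEATURE_LEVEL_10_1,  // where Sandy Bridge's GPU tops out
        D3D_FEATURE_LEVEL_10_0,
    };
    D3D_FEATURE_LEVEL granted = {};
    // Passing null for the device/context pointers just tests support.
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION, nullptr, &granted, nullptr);
    if (SUCCEEDED(hr))
        printf("Highest supported feature level: 0x%04x\n", granted);
    else
        printf("No hardware D3D11 device available.\n");
    return 0;
}
```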



Ivy Bridge was shown off playing Tom Clancy's HAWX 2.  Of course this was a carefully prepared demo, but framerates looked smooth.  This leads us to believe that Intel has at least closed the gap between Sandy Bridge (which can't even render DirectX 11 content) and rival Advanced Micro Devices, Inc.'s (AMD) top-end A-Series Fusion accelerated processing units, which perform pretty decently in HAWX 2 as well.



One thing Intel was plugging heavily was "hybrid" tablets, à la ASUSTEK Computer Inc.'s (TPE:2357) Eee Pad Transformer.  The basic idea is a tablet that can double as a laptop via a tailored keyboard (which may or may not be detachable).  This isn't exactly a new class of device, but it has yet to see sales as significant as those of traditional tablets.

However, with Apple, Inc. (AAPL) having been granted sole ownership of "minimalist" tablet designs in Germany, and with rulings pending in other regions, intellectual property may be one key obstacle for this class of devices.  Apple has made it clear that it will sue any tablet maker who "copies its design" (translation: any tablet maker who sees significant sales).

While this hybrid doesn't look much like an iPad, it does have a minimalist tablet face, a multi-touch screen, and a black bezel.  So don't be surprised if these designs draw similar lawsuits and sales bans if they do well, unless the majority of world courts disallow Apple's litigious crusade.



A small but interesting side note: Intel insisted that Thunderbolt is on its way to Windows.  Thus far Apple has enjoyed exclusive access to the ultra-fast communications standard, which is an early implementation of Intel's Light Peak technology.

The upside of Thunderbolt is that it is significantly faster than USB 3.0, offering 10 Gbit/s per channel versus USB 3.0's 5 Gbit/s.  The downside is that it relies on specialized cables with active chips built in, which cost around $50 USD.
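To put those rates in perspective, here's a back-of-the-envelope comparison using the headline signaling rates; real-world throughput will be lower once encoding and protocol overhead are factored in, and the 25 GB file size is just an arbitrary example:

```cpp
// Rough transfer-time comparison at headline signaling rates.
// Real-world throughput is lower due to protocol overhead.
#include <cstdio>

int main() {
    const double file_gb   = 25.0;   // e.g., a Blu-ray-sized file, in gigabytes
    const double tb_gbps   = 10.0;   // Thunderbolt, per channel
    const double usb3_gbps = 5.0;    // USB 3.0 SuperSpeed
    const double file_gbit = file_gb * 8.0;
    printf("Thunderbolt: %.0f s\n", file_gbit / tb_gbps);    // 20 s
    printf("USB 3.0:     %.0f s\n", file_gbit / usb3_gbps);  // 40 s
    return 0;
}
```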



In the realm of future hard drives, SATA is going to be phased out and replaced with connections over PCIe.  Intel is currently going the Advanced Host Controller Interface (AHCI) route -- a hardware abstraction layer that manages connections to PCIe hard drives.  Peripheral models employing this tech are coming soon.  They will work with standard eSATA cables, but will connect using PCIe lanes.
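To illustrate what "hardware abstraction layer" means in practice, here's a rough sketch of the Generic Host Control block that every AHCI driver programs against, regardless of what physically sits behind the controller.  Register offsets and bit fields follow the AHCI spec; obtaining the memory-mapped base address (the hba pointer) is platform-specific and omitted here:

```cpp
// Illustrative sketch of the AHCI Generic Host Control registers (start of
// the controller's ABAR region), per the AHCI spec. Mapping the base address
// is platform-specific and not shown.
#include <cstdint>
#include <cstdio>

struct HbaGhc {
    volatile uint32_t cap;   // 0x00: host capabilities
    volatile uint32_t ghc;   // 0x04: global host control
    volatile uint32_t is;    // 0x08: interrupt status
    volatile uint32_t pi;    // 0x0C: ports implemented (bitmask)
    volatile uint32_t vs;    // 0x10: AHCI version
};

void dump_host_caps(const HbaGhc* hba) {
    uint32_t cap = hba->cap;
    printf("ports: %u, command slots: %u, NCQ: %s\n",
           (cap & 0x1F) + 1,          // CAP.NP is zero-based
           ((cap >> 8) & 0x1F) + 1,   // CAP.NCS is zero-based
           (cap & (1u << 30)) ? "yes" : "no");  // CAP.SNCQ
}
```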



Obviously this is an inferior solution in the long term.  For that, Intel is looking to deploy a driver technology called NVM Express (NVMe), which should offer much more efficient file I/O with hard drives over PCIe connections.  A reference driver supporting transfers over two PCIe lanes (x2) is already complete for Linux.  Intel's engineers say they are currently working on a similar reference driver for Windows 7, though it isn't complete yet.
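The efficiency gain comes largely from NVMe's queueing model: commands are compact 64-byte entries written into deep submission queues in host memory, with the driver ringing a doorbell register to notify the controller -- a far cry from AHCI's 32 command slots.  The sketch below follows the NVMe spec's command layout, but the queue pointers and register mapping it assumes are simplified and hypothetical:

```cpp
// Illustrative sketch of NVMe command submission. The 64-byte entry layout
// and READ opcode follow the NVMe spec; queue allocation and the doorbell
// mapping are simplified placeholders.
#include <cstdint>

struct NvmeSqe {                 // 64-byte NVMe submission queue entry
    uint32_t cdw0;               // opcode (bits 7:0) + command ID (bits 31:16)
    uint32_t nsid;               // namespace ID
    uint64_t rsvd;
    uint64_t mptr;               // metadata pointer
    uint64_t prp1, prp2;         // physical region pages (data buffer)
    uint32_t cdw10, cdw11, cdw12, cdw13, cdw14, cdw15;  // command-specific
};
static_assert(sizeof(NvmeSqe) == 64, "SQ entries are 64 bytes");

// Queue a READ (opcode 0x02) and ring the submission queue tail doorbell.
void submit_read(NvmeSqe* sq_ring, uint16_t* tail, uint16_t depth,
                 volatile uint32_t* sq_doorbell,
                 uint64_t buf_phys, uint64_t lba, uint16_t nblocks) {
    NvmeSqe cmd = {};
    cmd.cdw0  = 0x02 | (uint32_t(*tail) << 16);  // opcode + command ID
    cmd.nsid  = 1;
    cmd.prp1  = buf_phys;
    cmd.cdw10 = uint32_t(lba);                   // starting LBA, low dword
    cmd.cdw11 = uint32_t(lba >> 32);             // starting LBA, high dword
    cmd.cdw12 = uint32_t(nblocks - 1);           // block count, zero-based
    sq_ring[*tail] = cmd;
    *tail = uint16_t((*tail + 1) % depth);
    *sq_doorbell = *tail;                        // notify the controller
}
```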



In related news, Intel was showing off tablets running Microsoft Corp.'s (MSFT) slick Windows 8 "Metro" user interface.  Animations on the x86 tablets looked very fluid (we saw one live in a small private setting).



The look of this tablet is far ahead of iOS 5 or even the relatively graphical Android 3.0 "Honeycomb".  When you consider that this slick GUI is bundled alongside the kind of fully functional file-browsing interface (via the accessible "traditional" Windows desktop) that iOS and Android lack, it won't be surprising if Windows 8 tablets are the hottest item come next year's holiday season.



Last, but not least, Intel happened to mention that it had produced a Haswell chip.  While not due to hit the market until 2013, the 22 nm architectural redesign ("tock") promises to make a huge splash in two years.



We probed an Intel executive about whether Haswell would pair its new CPU architecture with a "tick" GPU -- a largely carried-over design, in tick-tock parlance -- given that Ivy Bridge packs a "tock" GPU.  The executive hinted that Intel likes to "mix things up" and that sticking to a rigid two-year schedule would be too predictable to the competition.  This hints that Haswell might feature both CPU and GPU architectural revisions.  If so, that's very exciting news.

Coupled with the 22 nm tri-gate tech introduced with Ivy Bridge, Haswell should be a powerful bit of silicon when it hits the market.



Comments

That's nice
By bug77 on 9/15/2011 5:52:00 PM , Rating: 4
But can I have the option of not paying for a GPU I will not use?




RE: That's nice
By Hector2 on 9/15/11, Rating: 0
RE: That's nice
By dagamer34 on 9/15/2011 8:43:27 PM , Rating: 3
For a good 99% of what people do (even you), a dedicated GPU isn't really necessary and the integrated GPU will suffice, even on a desktop. Power efficiency is key, even in desktop computing (or do you not want a cheaper electric bill?).

What we do need is a seamless transition between integrated and dedicated GPUs to the point where the user is never able to tell the difference. Leveraging them both, especially in the mobile space, is how we're going to get longer battery life in tomorrow's computers.
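[Editor's note: a mechanism along these lines already exists on Windows laptops with NVIDIA's Optimus switchable graphics -- an application opts into the discrete GPU simply by exporting a documented global symbol, and the driver handles the handoff invisibly. A minimal sketch; AMD's analogous export is shown too, though its availability varies by driver version:]

```cpp
// Sketch: exporting these documented globals asks the hybrid-graphics
// driver (NVIDIA Optimus; AMD's switchable-graphics analogue) to route
// this application to the discrete GPU. Windows/MSVC-specific.
#include <cstdint>

extern "C" {
    __declspec(dllexport) uint32_t NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
```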


RE: That's nice
By xxxdominionxxx on 9/15/2011 9:40:49 PM , Rating: 2
I agree with the GPU point. You would think that by today's standards, or at least in the near future, the nm shrinks would let them easily put a shrunken video architecture (let's say an NVIDIA 8-series) directly on the CPU -- possibly even activating only a certain percentage of the GPU depending on the graphics load at any given moment.


RE: That's nice
By Onimuto on 9/16/2011 10:21:43 AM , Rating: 2
I would beg to differ. 99% of people, maybe in the 1990s. With the evolution of tech, more and more people are buying still and HD motion cameras, and then you have the smartphones producing HD video and high-res stills. Programs are becoming more and more idiot-proof, and I know people who now routinely do video and photo editing on their machines. Still, for a good portion of these people the integrated GPU will work just fine. But for those of us who use high-end editing software -- like me, with Adobe CS5.5 Master Suite -- the integrated GPU is about as useless as tits on a boar hog. Same for anyone playing high-end games. Sure, I can play my games and use programs like CS5.5 or Pinnacle HD on the CPU alone, but you are stuck with stuttering results, and of course you don't get real-time modifying like in Photoshop Extended, After Effects, etc.
You ever tried using HDR in Photoshop Extended? GPUs are required for fluid work, unless you love the loading bar popping up every two seconds. As far as energy cost, my 17.3-inch MSI GT's annual cost is a mere $170 a year, running 24 hours a day. PCs aren't like your AC, electric stove, or water heater, which contribute 90% of your monthly bill.


RE: That's nice
By FaceMaster on 9/16/2011 6:19:24 PM , Rating: 1
Um... have you ignored EVERYTHING that has been said about this new processor? The CPU is the main thing for displaying still pictures, and with Ivy Bridge displaying 20 HD video streams at once in real time, I don't think even 1% of users will need more than this sort of performance for video-editing work.

You talk about fluid HDR work in Photoshop... once again, I think you'll find you're talking about a very small minority of people. And even then, I'm pretty sure that Ivy Bridge would suffice. Maybe even Sandy Bridge. I've found GPU support in these sorts of programs fleeting at best.

Finally, you talk about 'high-end games'. I doubt that Ivy Bridge will compete with mid-to-high-end dedicated graphics cards, but when most games are built around GeForce 7000-series / ATI X1000-series graphics hardware (consoles), I'm pretty sure that Ivy Bridge will be capable of fairly good performance in most games.

The evolution of tech isn't making people need specialised hardware; it's making them need it less. Personally I'd still want a dedicated graphics card for what I do on the computer, but the things you've pointed out in your post are things a very small minority of people would use a computer for. Everybody else, even those doing what you do, would probably find the CPU enough.


RE: That's nice
By someguy123 on 9/15/2011 9:18:03 PM , Rating: 2
It's bundled in there because of how cheap it is. If they removed it, the consumer would probably only see pennies of savings per unit.

May as well keep it in there for people who find it useful. It's like those optical-out ports on your motherboard's sound card: most people I know don't even know what they are, but I have mine hooked up to an external DAC all the time.


RE: That's nice
By bug77 on 9/16/2011 4:13:37 AM , Rating: 2
quote:
It's bundled in there because of how cheap it is.


Cheap? It's 20% of the die area on Sandy Bridge, probably more on Ivy.
And I didn't say they should remove it; I just said I'd like to see models with no GPU alongside those with integrated graphics.


RE: That's nice
By someguy123 on 9/16/2011 1:10:57 PM , Rating: 2
Material measurements don't directly reflect the cost. Using one unified process probably saves them money compared to running a separate process for IGPs on motherboards. It's doubtful they would charge customers less per chip even if the GPU were excluded from certain models, since that would increase costs elsewhere -- and there is very high demand for their IGP, however low-performing it may be.


RE: That's nice
By Da W on 9/16/2011 1:47:03 PM , Rating: 1
Anyway, being a near-monopoly in this market, they sell at the profit-maximizing price (i.e., the price that maximizes quantity × price) given the demand they face, which is a function of the demand for PCs, the performance of the chip, and the availability of substitutes from competitors.

Cost of production has no link with selling price in this industry.


RE: That's nice
By Da W on 9/16/11, Rating: -1
RE: That's nice
By Reclaimer77 on 9/16/11, Rating: 0
"If you look at the last five years, if you look at what major innovations have occurred in computing technology, every single one of them came from AMD. Not a single innovation came from Intel." -- AMD CEO Hector Ruiz in 2007














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki