
Haswell CPUs will contain vector processors and a more powerful on-die GPU. The chips are designed to power the next generation of "Ultrabooks".  (Source: ASUSTek)

An Intel corporate blog post seemed to confirm both the presence of vector coprocessor silicon and a 2013 release date for the 22 nm Haswell.  (Source: Intel)
Company looks to new 22 nm architecture to hold off AMD and ARM Holdings

Intel Corp. (INTC) has dropped a few hints about its upcoming 22 nm Haswell architecture, currently under development by the company's secretive Oregon team.  In a post on the Intel Software Network blog titled "Haswell New Instruction Descriptions Now Available!", the company reveals that it plans to launch the new CPU in 2013.

Haswell will utilize the same power-saving tri-gate 3D transistor technology that will first debut with Ivy Bridge in early 2012.  Major architectural changes reportedly include a totally redesigned cache, fused multiply-add (FMA3) instruction support, and an on-chip vector coprocessor.

The vector coprocessor, which will work with the on-die GPU, was a major focus of the post.  The company is preparing a set of instructions called Advanced Vector Extensions (AVX), which will speed up vector math.  It writes:

Intel AVX addresses the continued need for vector floating-point performance in mainstream scientific and engineering numerical applications, visual processing, recognition, data-mining/synthesis, gaming, physics, cryptography and other areas of applications. Intel AVX is designed to facilitate efficient implementation by wide spectrum of software architectures of varying degrees of thread parallelism, and data vector lengths.

According to CNET, Intel's marketing chief Tom Kilroy indicates that Intel hopes for the new chip's integrated graphics to rival today's discrete graphics.  

Intel has a ways to go to meet that objective -- its on-die GPU in Sandy Bridge marked a significant improvement over past designs (which were traditionally housed in a separate package), but it still fell far short of the GPU found in Advanced Micro Devices, Inc.'s (AMD) Llano Fusion APUs.

Intel has had a love/hate relationship with graphics makers AMD and NVIDIA Corp. (NVDA).  While it's been forced to allow their GPUs to live on its motherboards and alongside its CPUs, the company has also fantasized about usurping the graphics veterans.  Those ambitions culminated in the company's Larrabee project, which aimed to offer discrete Intel graphics cards.

Now that a commercial release of Larrabee has been cancelled, Intel has seized upon on-die integrated graphics as its latest answer to try to push NVIDIA and AMD out of the market.  Intel is heavily promoting the concept of ultrabooks -- slender notebooks like Apple, Inc.'s (AAPL) MacBook Air or ASUSTek Computer Inc.'s (TPE:2357) UX21, which feature low-voltage CPUs and -- often -- no discrete GPU.

Mr. Kilroy reportedly wants ultrabook manufacturers using Haswell to shoot for a target MSRP of $599 USD, which would put them roughly in line with this year's Llano notebooks from AMD and its partners.  That's about $100 USD less than current Sandy Bridge notebooks run.

Intel also faces pressure from a surging ARM Holdings plc (ARMH), which is looking to unveil notebook processors sometime next year.


RE: wow no kidding
By MozeeToby on 6/24/2011 12:49:10 PM , Rating: 1
You don't have to have the most powerful product available to be successful; there is more to picking a solution than raw processing speed.

If I could get performance equal to today's high end graphics cards in an integrated solution on a laptop two years from now I would be thrilled; they'd have lower power consumption and heat than a discrete card, fewer moving parts to fail and make noise, and a smaller form factor. And 99% of PC games are designed to run well on 3 year old hardware, so I'd have at least a year of being able to run every AAA title out there.

RE: wow no kidding
By StevoLincolnite on 6/24/2011 3:17:07 PM , Rating: 4
And 99% of PC games are designed to run well on 3 year old hardware, so I'd have at least a year of being able to run every AAA title out there.

Maybe even less than 3 years.
Intel has historically been slow to adopt new DirectX standards; heck, past iterations never even supported the full DirectX standard. (I'm looking at you, GMA 900, 910, 915, 950, 3000, and 3100 chips with no TnL or vertex shaders.)

The end result is that despite those chips having "just enough" grunt to handle a modern game even at the lowest of settings... they didn't have the feature set to run them without additional software like SwiftShader, Oldblivion, etc.

Another example is Sandy Bridge: it is still stuck in the DirectX 10 era despite DirectX 11 being available on NVIDIA and AMD GPUs for a couple of generations now.

Plus, Intel drivers are just plain bad: slow to be updated, and the compatibility just isn't there.
It took Intel what... a year or two just to enable SM3 and TnL on the X3100 chips? Even then TnL was only half done, as some games still ran it in software.

Unfortunately, for the past decade I've told people: if you wish to play video games, then get even a low-end solution from NVIDIA or AMD and skip the Intel IGP.
The drivers at the very least make for a much less painful experience.

RE: wow no kidding
By justjc on 6/25/2011 4:20:56 AM , Rating: 2
Personally, ANY space on the CPU die devoted to graphics alone is wasted die space in my opinion...

Perhaps you're right on the Intel side, as Intel feels it would hurt their processor dominance if GPU acceleration is used.

On the AMD and ARM side, however, the space used for GPU processors will more and more be the part delivering the processing power. After all, today most browsers have graphics acceleration, Flash has graphics acceleration, Office 11 has graphics acceleration, and those are just the programs I could remember offhand.

No doubt GPU acceleration will play a bigger role in the future, and Intel will have to change its ways.

RE: wow no kidding
By SPOOFE on 6/26/2011 9:19:26 PM , Rating: 2
Adds some cost to the chipset, but is far more flexible...

In today's cost-conscious environment, more cost = less flexibility.

"Nowadays you can buy a CPU cheaper than the CPU fan." -- Unnamed AMD executive

Copyright 2016 DailyTech LLC.