



Haswell CPUs will contain vector processors and a more powerful on-die GPU. The chips are designed to power the next generation of "Ultrabooks".  (Source: ASUSTek)

An Intel corporate blog post seemed to confirm both the presence of vector coprocessor silicon and a 2013 release date for the 22 nm Haswell.  (Source: Intel)
Company looks to new 22 nm architecture to hold off AMD and ARM Holdings

Intel Corp. (INTC) has dropped a few hints about its upcoming 22 nm Haswell architecture, currently under development by the company's secret Oregon team.  In a post on the Intel Software Network blog titled "Haswell New Instruction Descriptions Now Available!", the company reveals that it plans to launch the new CPU in 2013.

Haswell will utilize the same power-saving tri-gate 3D transistor technology that will debut with Ivy Bridge in early 2012.  Major architectural changes reportedly include a totally redesigned cache, fused multiply-add (FMA3) instruction support, and an on-chip vector coprocessor.

The vector processor, which will work with the on-die GPU, was a major focus of the post.  The company is preparing a set of instructions called Advanced Vector Extensions (AVX), which will speed up vector math.  It writes:

Intel AVX addresses the continued need for vector floating-point performance in mainstream scientific and engineering numerical applications, visual processing, recognition, data-mining/synthesis, gaming, physics, cryptography and other areas of applications. Intel AVX is designed to facilitate efficient implementation by wide spectrum of software architectures of varying degrees of thread parallelism, and data vector lengths.
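
To put the vector-math claim in concrete terms, below is a minimal sketch of the kind of loop that AVX and FMA3-style fused multiply-add are meant to accelerate, written in C with the standard immintrin.h intrinsics. The function name, sample data, and build flags are illustrative assumptions, not taken from Intel's post.

/* Minimal sketch: an AVX + FMA "a*x + y" loop, the kind of vector
 * floating-point math described above.  Illustrative only.
 * Build (on a CPU/toolchain with AVX and FMA support):
 *   gcc -O2 -mavx -mfma saxpy_avx.c
 */
#include <immintrin.h>
#include <stdio.h>

/* y[i] = a * x[i] + y[i], processing 8 single-precision floats per step. */
static void saxpy_avx(float a, const float *x, float *y, int n)
{
    __m256 va = _mm256_set1_ps(a);           /* broadcast a into all 8 lanes */
    int i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);  /* load 8 floats of x */
        __m256 vy = _mm256_loadu_ps(y + i);  /* load 8 floats of y */
        vy = _mm256_fmadd_ps(va, vx, vy);    /* fused multiply-add: a*x + y */
        _mm256_storeu_ps(y + i, vy);
    }
    for (; i < n; i++)                        /* scalar tail for leftovers */
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    float x[16], y[16];
    for (int i = 0; i < 16; i++) { x[i] = (float)i; y[i] = 1.0f; }
    saxpy_avx(2.0f, x, y, 16);
    printf("y[5] = %f\n", y[5]);              /* expect 2*5 + 1 = 11 */
    return 0;
}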

According to CNET, Intel marketing chief Tom Kilroy indicates that Intel hopes the new chip's integrated graphics will rival today's discrete graphics.

Intel has a ways to go to meet that objective -- its on-die GPU in Sandy Bridge marked a significant improvement over past designs (which were traditionally housed in a separate package), but it still fell far short of the GPU found in Advanced Micro Devices, Inc.'s (AMD) Llano Fusion APUs.

Intel has enjoyed a love/hate relationship with graphics makers AMD and NVIDIA Corp. (NVDA).  While it's been forced to allow their GPUs to live on its motherboards and alongside its CPUs, the company has also dreamed of usurping the graphics veterans.  Those plans culminated in the company's Larrabee project, which aimed to offer discrete Intel graphics cards.

Now that a commercial release of Larrabee has been cancelled, Intel has seized upon on-die integrated graphics as its latest answer to try to push NVIDIA and AMD out of the market.  Intel is heavily promoting the concept of ultrabooks -- slender notebooks like Apple, Inc.'s (AAPL) MacBook Air or ASUSTeK Computer Inc.'s (TPE:2357) UX21, which feature low-voltage CPUs and -- often -- no discrete GPU.

Mr. Kilroy reportedly wants ultrabook manufacturers using Haswell to shoot for a target MSRP of $599 USD, which would put them roughly in line with this year's Llano notebooks from AMD and partners.  That's about $100 USD less than current Sandy Bridge notebooks run.

Intel also faces pressure from a surging ARM Holdings plc (ARMH), which is looking to unveil notebook processors sometime next year.



Comments



wow no kidding
By tastyratz on 6/24/2011 12:28:46 PM , Rating: 5
Something to be released in 2 years will rival the technology of today? Amazing!

What a useless comment. Cool if it is competitive with discrete laptop graphics - but usually "it's as good as 2-year-old tech" is not the bragging point to focus on.




RE: wow no kidding
By kleinma on 6/24/2011 12:46:40 PM , Rating: 3
Well they didn't say exactly which discrete cards of today it would rival. If they are saying the top of the line ATI radeon 6XXX series performance will be available in an integrated GPU, that is pretty impressive, even 2 years away. But discrete graphics cards come in a broad range, and they didn't elaborate really, so it is hard to know if I should be impressed or not.


RE: wow no kidding
By Mitch101 on 6/24/2011 1:05:06 PM , Rating: 1
I hope they do. I would like a lot more physics in my games. More than just a torn flag waving or other cheap eye candy.

Could also lead to cheap $400.00 laptops that rival today's mid-to-high range gaming rigs.


RE: wow no kidding
By bug77 on 6/24/11, Rating: 0
RE: wow no kidding
By Targon on 6/24/2011 8:17:26 PM , Rating: 2
Ageia failed because the framerates were horrible when the PPU was being used. Think back to the original 3D cards, where 3DFX dominated, and WHY. There were the 3DFX Voodoo cards, and then you had all the others, most of them not really improving the framerates, even though graphics clarity WAS better.

The Voodoo and Voodoo 2 offered better graphics and faster performance, so it was a win/win. Look at physics, where if you turn it on, your framerates go down so much you want to turn it off. Comparing against software physics with the feature on is the only thing that would make the PPU seem like an improvement.


RE: wow no kidding
By epobirs on 6/24/2011 9:34:56 PM , Rating: 2
Ageia was always doomed, even in their best-case scenario of delivering something in their discrete product that couldn't be matched on pure performance by the GPU makers. The best they could hope for was to turn a big profit on a buyout by one of the big GPU companies.

The problem was that they could never hope to have a slam-dunk, "you either have our card or your stuff is hopelessly inferior" situation. The most that they could hope for was a brief window when a few games would get a lot of mileage out of the card. The technology wasn't sufficiently different from what GPGPU enables, meaning the next GPU generation would always offer a substantial chunk of what the discrete card offered, but effectively for free in the end user's eyes.

Worse, high-end users faced a choice between adding a PPU card or a second GPU card. The latter choice was arguably more versatile, albeit more costly in most cases where the card is fairly recent and strong.

So, even if it worked perfectly and gained wide support, Ageia couldn't ever offer a killer value proposition against the likes of Big GPU.


RE: wow no kidding
By Samus on 6/25/2011 7:34:29 AM , Rating: 2
Hearing Intel talk up their 'next-gen graphics' is like listening to a broken record, and dates back about as far as vinyl as well.

I simply fail to imagine how Intel will out-engineer AMD in GPU design. Having an enemy in nVidia and a direct competitor in AMD, they will have to source their own engineering team, which, no matter how well funded, will simply not outpace nVidia or AMD in 2 years.


RE: wow no kidding
By ekv on 6/26/2011 2:43:41 AM , Rating: 2
quote:
I simply fail to imagine how Intel will out-engineer AMD in GPU design.
By and large I'd have to agree with you there. However, it is fair to say that Intel has superior manufacturing capabilities, and that their stumbles lately (the Cougar Point SATA bug) have been relatively minor (contrast with Phenom). In comparison, AMD/ATI and NVidia are rather tied to TSMC, and neither TSMC nor GlobalFoundries et al. can match Intel.

Intel has also shown somewhat surprising adaptability recently in terms of CPU architecture. AMD has seemingly been stagnant, though quite the opposite is likely true, perhaps due to budget / cash-flow difficulties. Which goes to show once again that a sound economic position can help you [buy engineers].


RE: wow no kidding
By HollyDOL on 6/29/2011 2:42:53 PM , Rating: 2
I wouldn't underestimate Intel. Don't forget they can afford to throw much more money at the project - probably much more than AMD and nVidia could put in together.
That, in combination with the technological advantage they currently have, could create a path for them to a really competitive GPU. Not saying it will, just that it could be possible.


RE: wow no kidding
By SPOOFE on 6/26/2011 9:11:11 PM , Rating: 2
quote:
Really, are you that bothered by the fact that when you toss a grenade it has a perfect parabolic trajectory, completely disregarding the air friction?

What a strangely specific example.

I'm of the opinion that stuff looks more real if it moves like it's real. You can slap all the textures and shaders and effects onto a 3D model that you like, even to the point of making it look photorealistic for stills, but if it still moves like a mannequin having a stroke, the effect is ruined.


RE: wow no kidding
By bug77 on 6/27/2011 3:52:41 AM , Rating: 3
That's what I was looking for: an example where real physics will have a noticeable impact. 'Cause we have ragdoll physics right now. All I have seen in tech demos is smoke, cloth (both of which will go unnoticed in a fast-paced game - maybe useful in an RPG, but that's more about the story and gameplay) and particles. And particles are useless, since we saw that while you may be able to compute a million of them each second, no video card will cope with displaying that many.


RE: wow no kidding
By DanNeely on 6/24/2011 1:35:53 PM , Rating: 2
Actually, getting something close on paper wouldn't be that hard. On paper Llano is 25% as fast (lower in reality due to the memory bottleneck), and GPUs are still seeing enough design improvements to double yearly, instead of every 18 months. 22nm could get them halfway there all by itself; and Intel is claiming tri-gate's performance boost is the equivalent of a process node, which gets them most of the rest of the way there.
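
A rough worked version of that back-of-the-envelope math, purely as an illustration; every factor below is an assumption lifted from the comment above, not a measured number.

/* Back-of-the-envelope compounding of the factors assumed above. */
#include <stdio.h>

int main(void)
{
    double start   = 0.25; /* assumed starting point: ~25% of the target, on paper  */
    double shrink  = 2.0;  /* assumed gain from the 22 nm process shrink            */
    double trigate = 2.0;  /* claimed tri-gate boost ~= one more node's worth       */

    /* 0.25 * 2 * 2 = 1.0, i.e. rough parity -- on paper only */
    printf("compounded ratio: %.2f\n", start * shrink * trigate);
    return 0;
}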


RE: wow no kidding
By Belard on 6/24/2011 4:25:25 PM , Rating: 2
For all we know, and most likely... they'll have an on-die GPU that'll be equal to today's discrete video cards... since they are NOT specific, and intel is - thankfully - bad at video (just stick to CPUs and SSDs), they could be talking about the ATI 6450, a bottom-end video card, which is a bit better than what AMD is offering in their Fusion-C CPU/APUs.

So... two years from now, AMD will have their 9000 series cards (again) and intel will be 2-4 years behind, as usual. And of course, AMD Fusion chips will be more advanced than they are today.

intel... sometimes does stupid things. That's nice.


RE: wow no kidding
By fic2 on 6/24/2011 10:41:43 PM , Rating: 3
quote:
intel... sometimes does stupid things. That's nice.


Biggest thing I can think of in their recent "GPU competition" mode was ripping out half of the GPU in most of the Sandy Bridge chips sold. Whose stupid idea was it to include the HD3000 only in the 'K' series overclocker parts?


RE: wow no kidding
By fteoath64 on 6/25/2011 3:53:41 AM , Rating: 2
Intel's own stupid idea, of course! If they had the HD3000 in the lower-end chips, they would be competing against themselves, which is a bad idea. So crippling the lower-end chip is a "marketing" decision.

I still think they ought to ship variants of the SB chips without any GPU cores in there, since they suck and OEMs put in a discrete GPU anyway - so why pay for an energy-wasting part that is not used?


RE: wow no kidding
By PhatoseAlpha on 6/25/2011 7:38:36 PM , Rating: 2
A quick look over today's PC games shows a very simple reality: Huge numbers of them are console ports, designed to run on hardware that's 6 years old already. The 360 isn't scheduled for replacement until 2015.

You don't need a 2013 graphics card to play a console port. You don't even need a 2011 graphics card unless you're doing something like 6xMultimonitor and Anti-aliasing.

In that light...well, a CPU that can smoothly play WoW and the vast menagerie of console ports without a graphics card doesn't seem quite so stupid.


RE: wow no kidding
By 85 on 6/24/2011 12:47:54 PM , Rating: 4
quote:
Something to be released in 2 years will rival the technology of today? Amazing!


LMAO!

i was thinking the same thing. reminds me of the Guinness commercials where the guys with mustaches say "brilliant!"


RE: wow no kidding
By MozeeToby on 6/24/2011 12:49:10 PM , Rating: 1
You don't have to have the most powerful product available to be successful; there is more to picking a solution than raw processing speed.

If I could get performance equal to today's high end graphics cards in an integrated solution on a laptop two years from now I would be thrilled; they'd have lower power consumption and heat than a discrete card, fewer moving parts to fail and make noise, and a smaller form factor. And 99% of PC games are designed to run well on 3 year old hardware, so I'd have at least a year of being able to run every AAA title out there.


RE: wow no kidding
By StevoLincolnite on 6/24/2011 3:17:07 PM , Rating: 4
quote:
And 99% of PC games are designed to run well on 3 year old hardware, so I'd have at least a year of being able to run every AAA title out there.


Maybe even less than 3 years.
Intel has historically been slow to adopt new Direct X standards, heck past iterations never even supported the full Direct X standard. (I'm looking at you, GMA 900, 910, 915, 950, 3000, 3100 chips with no TnL or vertex shaders.)

End result is that despite the chips having "just enough" grunt to handle a modern game even at the lowest of settings... they didn't have the feature set to run them without additional software like Swift Shader, Oldblivion, etc.

Another example is Sandy Bridge: it is still stuck in the Direct X 10 era despite Direct X 11 being available on nVidia and AMD GPUs for a couple of generations now.

Plus, Intel drivers are just plain bad: slow to be updated, and the compatibility just isn't there.
It took Intel what... a year or two just to enable SM3 and TnL on the X3100 chips? Even then TnL was only half done, as some games still ran it in software.

Unfortunately, for the past decade I've told people that if you wish to play video games, then get even a low-end solution from nVidia or AMD and skip the Intel IGP.
The drivers, at the very least, make for a much less painful experience.


RE: wow no kidding
By Motoman on 6/24/11, Rating: 0
RE: wow no kidding
By croc on 6/25/11, Rating: 0
RE: wow no kidding
By justjc on 6/25/2011 4:20:56 AM , Rating: 2
quote:
Personally, ANY space on the CPU die devoted to graphics alone is wasted die space in my opinion...


Perhaps you're right on the Intel side, as Intel feels it would hurt their processor dominance if GPU acceleration is used.

On the AMD and ARM side, however, the space used for GPU processors will more and more be the part delivering the processing power. After all, today most browsers have graphics acceleration, Flash has graphics acceleration, Office 11 has graphics acceleration - and that's just the programs available today that I could remember.

No doubt GPU acceleration will play a bigger role in the future, and Intel will have to change their ways.


RE: wow no kidding
By SPOOFE on 6/26/2011 9:19:26 PM , Rating: 2
quote:
Adds some cost to the chipset, but is far more flexible...

In today's cost-conscious environment, more cost = less flexibility.


RE: wow no kidding
By Motoman on 6/24/2011 1:02:35 PM , Rating: 5
Hey now, this is Intel we're talking about. Being only 2 years behind would be an enormous achievement for them.


RE: wow no kidding
By Mitch101 on 6/24/2011 1:41:51 PM , Rating: 2
Now that I think about it, Intel was going a different direction than AMD/NVIDIA. Intel was optimizing their design for ray tracing, whereas current GPUs from AMD/NVIDIA are optimized for rasterization. Since game engines are written around rasterization, this might explain why AMD/NVIDIA have such a lead over Intel's GPUs.

See Image Ray Tracing vs Rasterization
http://www.cdrinfo.com/images/uploaded/Ray-tracedV...

If I recall correctly from a number of articles, real-time ray tracing is a ways off, but they would adopt a middle ground combining the best of both before reaching the ultimate goal.

My thought is: if Intel can achieve 60 FPS through ray tracing while AMD/NVIDIA achieve 200 FPS through rasterization, then although the AMD/NVIDIA offering is faster, the end result is that Intel could have more realistic visuals.

Of course, game engines would need to be written to render via ray tracing, which as I understand it is conceptually easy, but the details aren't.

Just my thoughts. Don't count out Intel.


RE: wow no kidding
By Motoman on 6/24/2011 2:08:05 PM , Rating: 1
...considering that Intel's marketshare in the Gaming Graphics market is, um, 0%...I wouldn't hold my breath waiting for any game manufacturers to start writing special rendering code for Intel chips.


RE: wow no kidding
By Mitch101 on 6/24/2011 3:47:05 PM , Rating: 2
In gaming graphics, yes:
http://store.steampowered.com/hwsurvey/videocard/

But Intel dominates in Integrated graphics.
http://software.intel.com/en-us/articles/common-mi...

Of course, who knows how many of those integrated graphics machines aren't actually using it and have a video card plugged in instead.

Don't underestimate companies with deep pockets.


RE: wow no kidding
By ClownPuncher on 6/24/2011 3:50:26 PM , Rating: 2
They really blew my socks off with larrabee. My larrabee card can play Crysis 4 maxed out!


RE: wow no kidding
By Alexvrb on 6/25/2011 3:28:21 PM , Rating: 2
Oh yeah well I'm running 4 Larrabee cards in SLIfireX on my NF1290FXH97 and I can run Crysis 4 in 6 monitor 3D mode maxed out!


RE: wow no kidding
By Motoman on 6/24/2011 7:50:15 PM , Rating: 1
Call me crazy but I don't think game producers build games with the intention that they'll be played on PCs not intended to play games. Like ones with integrated graphics.


RE: wow no kidding
By SPOOFE on 6/26/2011 9:20:59 PM , Rating: 2
They're absolutely crazy to ignore the largest market in existence. The guys that made Torchlight were geniuses; that game runs excellently on Intel graphics.


RE: wow no kidding
By Motoman on 6/27/2011 11:22:13 AM , Rating: 1
Uh-huh. Never heard of it.

Reckon COD runs well on Intel graphics? Rift? Dare I say... Crysis?

If your assertion is correct, then every major gaming company is crazy. None of them do any work to accommodate Intel graphics - most of them buddy up with either ATI or Nvidia as it is.


"This is about the Internet.  Everything on the Internet is encrypted. This is not a BlackBerry-only issue. If they can't deal with the Internet, they should shut it off." -- RIM co-CEO Michael Lazaridis














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki