

Haswell CPUs will contain vector processors and a more powerful on-die GPU. The chips are designed to power the next generation of "Ultrabooks".  (Source: ASUSTek)

An Intel corporate blog post seemed to confirm both the presence of vector coprocessor silicon and a 2013 release date for the 22 nm Haswell.  (Source: Intel)
Company looks to new 22 nm architecture to hold off AMD and ARM Holdings

Intel Corp. (INTC) has dropped a few hints about its upcoming 22 nm Haswell architecture, currently under development by the company's secretive Oregon team.  In a post on the Intel Software Network blog titled "Haswell New Instruction Descriptions Now Available!", the company reveals that it plans to launch the new CPU in 2013.

Haswell will utilize the same power-saving tri-gate 3D transistor technology that will first debut with Ivy Bridge in early 2012.  Major architectural changes reportedly include a totally redesigned cache, fused multiply-add (FMA3) instruction support, and an on-chip vector coprocessor.
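FMA3 folds a multiply and an add (a × b + c) into a single instruction with one rounding step, which benefits the dense multiply-accumulate loops common in scientific and media code. As a rough illustration only -- this is our own sketch, not code from Intel's post -- here is how such an operation is typically expressed through C compiler intrinsics (the function name saxpy8 is made up; the intrinsics live in immintrin.h and need FMA-capable hardware plus a flag such as gcc's -mfma):

    #include <immintrin.h>

    /* Computes y = a*x + y over 8 packed floats with one fused multiply-add. */
    void saxpy8(float *y, const float *x, float a)
    {
        __m256 va = _mm256_set1_ps(a);    /* broadcast the scalar a into all 8 lanes */
        __m256 vx = _mm256_loadu_ps(x);   /* load 8 floats from x */
        __m256 vy = _mm256_loadu_ps(y);   /* load 8 floats from y */
        _mm256_storeu_ps(y, _mm256_fmadd_ps(va, vx, vy));  /* y = a*x + y, single rounding */
    }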

The vector processor, which will work with the on-die GPU, was a major focus of the post.  The company is preparing a set of instructions called Advanced Vector Extensions (AVX), which will speed up vector math.  It writes:

Intel AVX addresses the continued need for vector floating-point performance in mainstream scientific and engineering numerical applications, visual processing, recognition, data-mining/synthesis, gaming, physics, cryptography and other areas of applications. Intel AVX is designed to facilitate efficient implementation by wide spectrum of software architectures of varying degrees of thread parallelism, and data vector lengths.
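For readers wondering what "vector floating-point performance" means in practice, the sketch below is a minimal example of our own (not taken from Intel's materials, and assuming a C compiler with AVX support, e.g. gcc -mavx) that scales an array of floats eight at a time using 256-bit AVX registers:

    #include <immintrin.h>

    /* Multiplies n floats in src by factor and writes the results to dst, 8 at a time. */
    void scale_array(float *dst, const float *src, float factor, int n)
    {
        __m256 f = _mm256_set1_ps(factor);           /* broadcast factor into all 8 lanes */
        int i = 0;
        for (; i + 8 <= n; i += 8) {
            __m256 v = _mm256_loadu_ps(src + i);     /* load 8 floats */
            _mm256_storeu_ps(dst + i, _mm256_mul_ps(v, f));
        }
        for (; i < n; i++)                           /* scalar tail for any leftovers */
            dst[i] = src[i] * factor;
    }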

According to CNET, Intel marketing chief Tom Kilroy indicated that the company hopes the new chip's integrated graphics will rival today's discrete graphics.

Intel has a ways to go to meet that objective -- its on-die GPU in Sandy Bridge marked a significant improvement over past designs (which were traditionally housed in a separate package), but it still fell far short of the GPU found in Advanced Micro Devices, Inc.'s (AMD) Llano Fusion APUs.

Intel has enjoyed a love/hate relationship with graphics makers AMD and NVIDIA Corp. (NVDA).  While it has been forced to allow their GPUs to live on its motherboards and alongside its CPUs, the company has also dreamed of usurping the graphics veterans.  Those ambitions culminated in the company's Larrabee project, which aimed to offer discrete Intel graphics cards.

Now that a commercial release of Larrabee has been cancelled, Intel has seized upon on-die integrated graphics as its latest answer to try to push NVIDIA and AMD out of the market.  Intel is heavily promoting the concept of ultrabooks -- slender notebooks like Apple, Inc.'s (AAPL) MacBook Air or ASUSTek Computer Inc.'s (TPE:2357) UX21, which feature low-voltage CPUs and -- often -- no discrete GPU.

Mr. Kilroy reportedly wants ultrabook manufacturers using Haswell to shoot for a target MSRP of $599 USD, which would put them roughly in line with this year's Llano notebooks from AMD and its partners.  That's about $100 USD less than current Sandy Bridge notebooks run.

Intel also faces pressure from a surging ARM Holdings plc (ARMH), which is looking to unveil notebook processors sometime next year.



Comments



RE: wow no kidding
By kleinma on 6/24/2011 12:46:40 PM , Rating: 3
Well, they didn't say exactly which discrete cards of today it would rival. If they are saying top-of-the-line ATI Radeon 6XXX series performance will be available in an integrated GPU, that is pretty impressive, even 2 years away. But discrete graphics cards come in a broad range, and they didn't really elaborate, so it is hard to know if I should be impressed or not.


RE: wow no kidding
By Mitch101 on 6/24/2011 1:05:06 PM , Rating: 1
I hope they do. I would like a lot more physics in my games -- more than just a torn flag waving or other cheap eye candy.

Could also lead to cheap $400 laptops that rival today's mid-to-high-range gaming rigs.


RE: wow no kidding
By bug77 on 6/24/11, Rating: 0
RE: wow no kidding
By Targon on 6/24/2011 8:17:26 PM , Rating: 2
Ageia failed because the framerates were horrible when the PPU was being used. Think back to the original 3D cards, where 3DFX dominated, and WHY. There were the 3DFX Voodoo cards, and then you had all the others, most of them not really improving framerates, even though graphics clarity WAS better.

The Voodoo and Voodoo 2 delivered better graphics and faster performance -- it was a win/win. Look at physics, where if you turn it on, your framerates drop so much you want to turn it off. The only thing that makes the PPU seem like an improvement is comparing it against software acceleration with the feature turned on.


RE: wow no kidding
By epobirs on 6/24/2011 9:34:56 PM , Rating: 2
Ageia was always doomed, even in their best-case scenario of delivering something in their discrete product that couldn't be matched on pure performance by the GPU makers. The best they could hope for was to turn a big profit on a buyout by one of the big GPU companies.

The problem was that they could never hope to have a slam-dunk, "either you have our card or your stuff is hopelessly inferior" situation. The most they could hope for was a brief window when a few games would get a lot of mileage out of the card. The technology wasn't sufficiently different from what GPGPU enables, meaning the next GPU generation would always offer a substantial chunk of what the discrete card offered, but effectively for free in the end user's eyes.

Worse, high-end users faced a choice between adding a PPU card or a second GPU card. The latter choice was arguably more versatile, albeit more costly in most cases where the card is fairly recent and strong.

So, even if it worked perfectly and gained wide support, Ageia couldn't ever offer a killer value proposition against the likes of Big GPU.


RE: wow no kidding
By Samus on 6/25/2011 7:34:29 AM , Rating: 2
Hearing Intel talk up their 'next-gen graphics' is like listening to a broken record, and dates back about as far as vinyl as well.

I simply fail to imagine how Intel will out-engineer AMD in GPU design. With an enemy in nVidia and a direct competitor in AMD, they will have to build their own engineering team, which, no matter how well funded, will simply not outpace nVidia or AMD in 2 years.


RE: wow no kidding
By ekv on 6/26/2011 2:43:41 AM , Rating: 2
quote:
I simply fail to imagine how Intel will out-engineer AMD in GPU design.
By and large I'd have to agree with you there. However, it is fair to say that Intel has superior manufacturing capabilities, and that its recent stumbles (the Cougar Point SATA bug) have been relatively minor (contrast with Phenom). By comparison, AMD/ATI and NVidia are largely tied to TSMC, and neither TSMC nor GlobalFoundries et al. can match Intel's process technology.

Intel has also shown somewhat surprising adaptability recently in terms of CPU architecture. AMD has seemingly been stagnant, though quite the opposite is likely true, perhaps due to budget / cash-flow difficulties. Which goes to show once again that a sound economic position can help you [buy engineers].


RE: wow no kidding
By HollyDOL on 6/29/2011 2:42:53 PM , Rating: 2
I wouldn't underestimate Intel. Don't forget they can afford to throw much more money at the project -- probably more than AMD and nVidia could put in together.
That, combined with the technological advantage they currently hold, could create a path to a genuinely competitive GPU. Not saying it will, just that it could.


RE: wow no kidding
By SPOOFE on 6/26/2011 9:11:11 PM , Rating: 2
quote:
Really, are you that bothered by the fact that when you toss a grenade it has a perfect parabolic trajectory, completely disregarding the air friction?

What a strangely specific example.

I'm of the opinion that stuff looks more real if it moves like it's real. You can slap all the textures and shaders and effects onto a 3D model that you like, even to the point of making it look photorealistic for stills, but if it still moves like a mannequin having a stroke, the effect is ruined.


RE: wow no kidding
By bug77 on 6/27/2011 3:52:41 AM , Rating: 3
That's what I was looking for: an example where real physics would have a noticeable impact. Because we have ragdoll physics right now. All I have seen in tech demos is smoke, cloth (both of which go unnoticed in a fast-paced game -- maybe useful in an RPG, but that's more about the story and gameplay) and particles. And particles are useless, since we've seen that while you may be able to compute a million of them each second, no video card can cope with displaying that many.


RE: wow no kidding
By DanNeely on 6/24/2011 1:35:53 PM , Rating: 2
Actually, getting something close on paper wouldn't be that hard. On paper Llano is 25% as fast (lower in reality due to the memory bottleneck), and GPUs are still seeing enough design improvements to double yearly instead of every 18 months. 22 nm could get them halfway there all by itself, and Intel is claiming tri-gate's performance boost is the equivalent of a process node, which gets them most of the rest of the way there.
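(Taking that back-of-the-envelope argument at face value -- this is our reading of the poster's reasoning, not verified figures: 25% x 2 for the 22 nm shrink x 2 for tri-gate counted as another node comes to roughly 100% of a current discrete-class GPU, before any memory-bandwidth penalty is applied.)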


RE: wow no kidding
By Belard on 6/24/2011 4:25:25 PM , Rating: 2
For all we know, and most likely... they'll have an on-die GPU that'll be equal to today's discrete video cards... Since they are NOT specific, and intel is -- thankfully -- bad at video (just stick to CPUs and SSDs), they could be talking about the ATI 6450, a bottom-end video card, which is a bit better than what AMD is offering in their Fusion-C CPU/APUs.

So... two years from now, AMD will have their 9000 series cards (again) and intel will be 2-4 years behind, as usual. And of course, AMD Fusion chips will be more advanced than what they are today.

intel... sometimes does stupid things. That's nice.


RE: wow no kidding
By fic2 on 6/24/2011 10:41:43 PM , Rating: 3
quote:
intel... sometimes does stupid things. That's nice.


The biggest thing I can think of in their recent "GPU competition" mode was ripping out half of the GPU in most of the Sandy Bridge chips sold. Whose stupid idea was it to include the HD 3000 only in the 'K' series overclocker parts?


RE: wow no kidding
By fteoath64 on 6/25/2011 3:53:41 AM , Rating: 2
Intel's own stupid idea, of course! If they had the HD 3000 in the lower-end chips, they would be competing against themselves, which is a bad idea. So crippling the lower-end chip is a "marketing" decision.

I still think they ought to ship variants of the SB chips without any GPU cores in there, since they suck and OEMs put in a discrete GPU anyway -- so why pay for an energy-wasting part that isn't used?


RE: wow no kidding
By PhatoseAlpha on 6/25/2011 7:38:36 PM , Rating: 2
A quick look over today's PC games shows a very simple reality: Huge numbers of them are console ports, designed to run on hardware that's 6 years old already. The 360 isn't scheduled for replacement until 2015.

You don't need a 2013 graphics card to play a console port. You don't even need a 2011 graphics card unless you're doing something like 6xMultimonitor and Anti-aliasing.

In that light...well, a CPU that can smoothly play WoW and the vast menagerie of console ports without a graphics card doesn't seem quite so stupid.
