
New GPUs will highlight Intel's 22 nm fourth-gen Core chips

As Intel Corp. (INTC) winds up toward the launch of Haswell, the 22 nm architecture refresh and designated CPU core for its fourth-generation Core i-Series processors, Intel is spilling details on the chips' graphics processing unit.  Haswell cores will be paired with three different tiers of GPUs, depending on the power envelope.

I. Enter a New GPU Brand

The top two tiers of the on-die GPU lineup carry Intel's first-ever GPU brand.  Advanced Micro Devices, Inc. (AMD) has Radeon, NVIDIA Corp. (NVDA) has GeForce, and now Intel has announced it will call its high-end GPUs "Iris".  In the past it relied on humble numbered parts with no separate branding (e.g. last generation's Intel HD 2500 and Intel HD 4000).

Intel had briefly toyed with the idea of releasing Larrabee-branded, computation-heavy discrete GPUs.  Ultimately Intel abandoned that project, choosing to stick to an embedded path, which has now led it to Iris.

The new GPUs have been referred to as GT3 in past roadmaps (and shown running Skyrim in demos).  All of the new chips will pack support for OpenGL 4.0 and DirectX 11.1.

Intel GT3/"Iris" GPU running Skyrim [Image Source: Tiffany Kaiser/DailyTech]

At the low end of the power range, Intel's 15-watt Haswell chips (like the one presumably powering the company's 17 mm-thick hybrid ultrabook reference design) will get the HD 5000, a mild upgrade.  The performance increase in this segment is expected to be around 50 percent (over the HD 2500).

Intel Iris graphics
[Click to enlarge]

AMD's Fusion efforts were ultimately a wakeup call to Intel on the value of a high-quality embedded GPU.  But it appears that the student has now become the master; the performance of Iris and Iris Pro comes closer to matching a discrete GPU than AMD's Fusion chips have thus far.

II. Discrete Graphics Performance in an Embedded Package

Things will start to heat up in the 28W U-Series mobile CPUs (Core i5-branded parts and the like), which will get the new "Iris"-branded GPU (full name: Intel Iris Graphics 5100).  It's roughly 2x faster than the HD 4000.
 
Intel Iris

The Iris Pro gets a special boost -- dedicated eDRAM (embedded dynamic random access memory) is, for the first time, included alongside the GPU.  On high-end laptop chips -- the Core i7-branded H-Series mobile parts (47-55W of typical power) -- this is again expected to represent about a 2x speedup.
 
Intel Iris Pro

On the desktop side, Intel's GT3e "Iris Pro" part will get an even bigger boost, reaching a 3x speedup in the R-Series (65-84W) desktop chips.  The M-series laptop and K-Series desktop chips are expected to also have access to Iris Pro, although Intel hasn't revealed their exact level of performance increase.

Intel Iris Pro 2

An Ivy Bridge Core i7-3770K scored around 1560 in 3DMark 11 [source], so the new Iris Pro-equipped chips should be scoring over 3000 if Intel's performance claims are accurate.  That would put Intel's on-die graphics slightly ahead of a full discrete AMD Radeon HD 7750, which scores around 2900 [source].
Radeon HD 7750
The Iris Pro on-die GPU is approximately as powerful as last generation's Radeon HD 7750.
[Image Source: AMD]
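
As a rough back-of-envelope check of that arithmetic, here is a small sketch (a projection only; the baseline score, the claimed speedup factors, and the Radeon figure are simply the numbers quoted above, not measurements):

    # Projected 3DMark 11 scores from Intel's claimed speedups over the HD 4000 baseline.
    HD4000_BASELINE = 1560     # Ivy Bridge i7-3770K score cited above
    RADEON_HD7750 = 2900       # discrete comparison point cited above

    claimed_speedup = {
        "Iris (28W U-Series)": 2.0,
        "Iris Pro (H-Series mobile)": 2.0,
        "Iris Pro (R-Series desktop)": 3.0,
    }

    for part, factor in claimed_speedup.items():
        projected = HD4000_BASELINE * factor
        gap = projected - RADEON_HD7750
        print(f"{part}: ~{projected:.0f} ({gap:+.0f} vs. Radeon HD 7750)")

If Intel's 2x figure holds, the H-Series Iris Pro lands around 3120 -- just past the 7750's roughly 2900 -- which is where the "slightly better" claim above comes from.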

Whether the real-world performance truly lives up to that remains to be seen, but it's clear that this is Intel's most powerful GPU ever, and worthy of its first-ever branding.

[All slides are courtesy of Intel, via AnandTech]

Sources: Intel via AnandTech, Intel



Comments



RE: Booo
By Motoman on 5/2/2013 12:25:22 PM , Rating: 2
Virtually no one needs any more CPU power than what's been on the market for the past few years.

And realistically, only a small minority needs any more GPU power - gamers. Who, regardless of all the noise we make, are really a small portion of the overall PC market.

But the fact of the matter is that boosting the CPU power of your gaming rig is unlikely to make any difference in your gaming performance. Upgrading your video card will.

Hence...GPU more important than CPU. Has been for a while...will be for the foreseeable future.


RE: Booo
By bug77 on 5/2/2013 12:45:47 PM , Rating: 3
quote:
Virtually no one needs any more CPU power than what's been on the market for the past few years.


Even so, better IPC could mean desktop performance in a 20W envelope. I bet you'd have some use for that.


RE: Booo
By Motoman on 5/2/2013 5:59:24 PM , Rating: 2
That's important to large corporations and data centers.

For the average user with a PC at home? They're not going to notice the power savings.


RE: Booo
By bug77 on 5/2/2013 6:45:05 PM , Rating: 2
Oh, you will notice when your laptop lasts a day or more on a single charge while being as fast as a 3GHz desktop quad.


RE: Booo
By Motoman on 5/2/2013 7:45:42 PM , Rating: 2
Actually, no I wouldn't. I almost never use my laptop on the battery alone...if I'm near an outlet, I plug in. Saves my battery a lot of discharge/recharge cycles. The battery life of my laptop is all but irrelevant to me.

I use my laptop when I'm sitting on the couch in front of the TV...and there's an outlet *right there*. Or I use it when I'm on the road...and sitting in a cubicle at the office I'm visiting, there's an outlet *right there*. Or in the hotel I'm staying at...they have outlets too.

So...no. I really don't give a rat's a$$ what the battery runtime is on my laptop.


RE: Booo
By inighthawki on 5/3/2013 11:11:26 AM , Rating: 2
And you're pretty alone in that. His comment still stands that the "average consumer" will certainly notice lower power draw. Try to imagine that there are people out there who do not have the same priorities as you :)


RE: Booo
By Motoman on 5/3/2013 3:29:38 PM , Rating: 1
...try to imagine that he directed his comment directly at me, personally, and not the "average consumer."

Because he did.

If other people's priorities include senselessly burning out their laptop batteries (and putting up with the low-power performance profile while on battery) then that's their thing. I'm just tired of hearing them whine about having to buy new laptop batteries.


RE: Booo
By inighthawki on 5/3/2013 3:38:16 PM , Rating: 2
Sometimes when people say "you" they aren't directly referring to "only you in this one particular case." Sometimes it is a generic term referring to other people.

e.g. "When opening a can, you would want to use a can opener" - See how that is not necessarily directed directly at you, but is a general statement to people in general.


RE: Booo
By Motoman on 5/3/2013 3:49:57 PM , Rating: 2
Sure...in which one would examine the context in which the statement occurs to determine who the "you" is actually referring to.

I see nothing in the context to indicate he's doing anything but addressing me, individually and personally.


RE: Booo
By inighthawki on 5/3/2013 4:28:09 PM , Rating: 2
That's interesting, because when I read the comment I see nothing in the context that would imply that he IS referring directly to you. I thought he was trying to make a general statement. Maybe we should just ask him :)


RE: Booo
By Motoman on 5/3/2013 6:27:25 PM , Rating: 2
No, you're lying to avoid admitting you're being a duma$s.

Just shut the hell up please.


RE: Booo
By inighthawki on 5/3/2013 9:39:59 PM , Rating: 2
Wow, seriously? I try to be polite about something and you up the rudeness of your remarks? Very mature of you. Internet anonymity at its best.

Good day sir I don't know why I even bothered.


RE: Booo
By Motoman on 5/4/2013 12:46:34 PM , Rating: 2
Good. Maybe then you won't "bother" again. Which would make the world a better place.


RE: Booo
By JasonMick (blog) on 5/2/2013 12:47:22 PM , Rating: 4
quote:
Virtually no one needs any more CPU power than what's been on the market for the past few years.
I disagree somewhat from a computational standpoint.

While it's true that per-process CPU power has plateaued, there are still very important developments ongoing in the CPU.

First, with multiple cores and improved threading (combined with more memory), you can now keep more processes running smoothly at once. The processes themselves haven't gotten faster; it's the number that can run simultaneously that's improved.
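
A toy illustration of that point (a minimal Python sketch, not tied to any particular chip): each worker runs no faster than a single core allows, but more cores let more of them run at full speed in the same wall-clock time.

    import multiprocessing as mp
    import time

    def busy_work(n):
        # CPU-bound loop; how fast ONE of these runs is fixed by single-core speed
        total = 0
        for i in range(n):
            total += i * i
        return total

    if __name__ == "__main__":
        workers = mp.cpu_count()   # more cores/threads -> more workers running at full speed at once
        start = time.perf_counter()
        with mp.Pool(workers) as pool:
            pool.map(busy_work, [2_000_000] * workers)
        print(f"{workers} workers finished in {time.perf_counter() - start:.2f}s")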

Secondly, power consumption has been on a steady decline thanks not only to die shrinks, but to architectural improvements to x86 as well. In laptops this is arguably as important as GPU development, or more so.

Thus I agree with part of your premise -- that in single-threaded gaming GPU > CPU -- but I'd like to qualify/disagree with your generalization that this applies to all standard consumer use scenarios.


RE: Booo
By Cloudie on 5/2/2013 2:09:16 PM , Rating: 2
Performance per watt is now more important than pure performance.

That has translated to vastly better notebook CPUs as desktop-class performance has filtered down.

Just look at the notebook CPUs from 2010 (the meh that was dual-core Arrandale/quad-core Clarksfield) compared to a quad-core Sandy, Ivy or Haswell. Massive performance increase. Dual core is also significantly better now, and can fit into much thinner/smaller form factors.


RE: Booo
By Bad-Karma on 5/3/2013 8:58:22 AM , Rating: 2
quote:
Performance per watt is now more important than pure performance.


That would depend on the user. Someone running a workstation for HPC-dependent apps would certainly disagree with you.


RE: Booo
By smilingcrow on 5/3/2013 11:51:05 AM , Rating: 2
Performance per watt also affects workstations, as they are limited by the maximum TDP for CPUs. So within a 150W TDP limit, the higher the performance per watt, the higher the performance.
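
To put toy numbers on it (purely illustrative, not from any spec sheet): at that same 150W ceiling, a chip delivering 20 performance units per watt tops out around 3,000 units, while one delivering 30 per watt reaches about 4,500 -- the performance ceiling scales directly with efficiency.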


RE: Booo
By Motoman on 5/2/2013 5:58:00 PM , Rating: 2
You're assuming that the vast majority of PC owners use them like you do.

They don't.

Virtually no one needs half of the CPU power they likely have now...and wouldn't notice the difference if you doubled what they have.

Only the small portion of the market that really taxes their computers...and especially gamers...notice any difference. And if you're a gamer, chances are you're already at the "more than enough" level of CPU power - but better graphics always need better GPUs.


RE: Booo
By Bad-Karma on 5/3/2013 9:02:59 AM , Rating: 2
I agree with you as to the PC user base.

But in the microcosm that is the gaming world there are many games, even very new ones, that are more CPU dependent than GPU.


RE: Booo
By inighthawki on 5/3/2013 11:13:45 AM , Rating: 2
There are very few games that are CPU bound, and of the ones that are, I would claim they're poorly coded.


RE: Booo
By Motoman on 5/3/2013 3:53:26 PM , Rating: 2
I'm not sure that I'd 100% go with the second thought there, but the first one is very valid.

It's a rare game that isn't GPU bound. And if it isn't, the CPU portion of the bottleneck can almost certainly be taken care of with a modest CPU - i.e. your existing CPU must really be a POS.

Virtually any gaming upgrade, if you have even a modest modern CPU, is going to center almost entirely on the GPU.


RE: Booo
By someguy123 on 5/3/2013 10:55:37 PM , Rating: 2
That's just silly. "Virtually everyone" has either a very low quality discrete or an integrated system, and the vast majority of people will be saturating their CPU due to poor GPU acceleration, if any. People on GMA integrated are going to peak out their processing power by simply going to youtube/vimeo or trying to play flash games (maybe even browsing websites with flash banners).

"Virtually everyone" will be saturating either their GPU or CPU as APU offerings aren't very powerful and intel HD offerings still have issues with drivers for hardware acceleration. The enthusiasts are the ones more likely to have lots of headroom as they have faster parts for software and games that are incredibly underutilized for browsing or notepad. Average joe caring less about overall performance != lack of utilization.


RE: Booo
By Motoman on 5/4/2013 12:48:48 PM , Rating: 2
You're ridiculously wrong. You honestly think that existing APU designs are going to "peak out their processing power" watching something on Youtube?

You're a tool. Go grab some computer with a low-end AMD APU in it and watch something on Youtube. Then get back to us on how it "peaked out."


RE: Booo
By someguy123 on 5/7/2013 10:27:02 PM , Rating: 2
"Virtually everyone" has an intel system with integrated, most likely still GMA, not an APU. I like how you use ad populum only when it suits your argument. Intel's crap gpus have the bulk of the market and the only way "virtually everyone" would be under utilizing is if modern chips became cheap enough, which requires, you guessed it, faster chips/smaller processes to justify dropping price.


RE: Booo
By sorry dog on 5/2/2013 6:04:34 PM , Rating: 3
quote:
Virtually no one needs any more CPU power than what's been on the market for the past few years.


Or 64K for that matter, right Mr. Gates?


RE: Booo
By Motoman on 5/2/2013 7:46:33 PM , Rating: 2
Walk around an office building and double the CPU power in every computer on every desk.

Then watch and see that essentially no one notices.


RE: Booo
By FITCamaro on 5/3/2013 7:59:29 AM , Rating: 2
Well my office would notice. But then we're doing software development. Not spreadsheets.


RE: Booo
By ShieTar on 5/3/2013 9:54:49 AM , Rating: 3
I'm doing spreadsheets, and I would notice too. I do post-processing of orbit simulation results delivered as 10,000-line Excel spreadsheets, where the formulas are a bit more non-linear and complex than the simple additions of a financial spreadsheet. That means I can update a few parameters, press 'F9' to recalculate, and go drink a coffee while my business laptop grinds through 10 minutes of calculations.

I don't do enough of this to go and buy a workstation, but I sure was happy with the latest roll-out replacing the old Core 2 Duo with a new Core i5, cutting computation time in half.


RE: Booo
By Rukkian on 5/3/2013 1:21:28 PM , Rating: 2
I can't speak to your situation directly, but most of the time there is another bottleneck, and usually that is the hard drive. Hard drives usually end up being the culprit behind slowness in computers.

In most situations, without a faster hard drive, a faster processor will not make a huge difference if you already have a fairly up-to-date CPU.


RE: Booo
By Motoman on 5/3/2013 2:05:10 PM , Rating: 2
...and I would imagine that both you and FIT understand that your PC usage is far from the norm. The vast majority of people aren't compiling code all the time, or doing orbit simulations.

So for you two guys...sure. Crank up your CPU all you like. But the vast majority wouldn't notice.


"And boy have we patented it!" -- Steve Jobs, Macworld 2007














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki