
New GPUs will highlight Intel's 22 nm fourth-gen Core chips

As Intel Corp. (INTC) winds up towards the launch of Haswell -- its 22 nm architecture refresh and the CPU core behind fourth-generation Core i-Series processors -- it's spilling details on the chips' graphics processing units.  Haswell cores will be paired with three different tiers of GPUs, depending on the power envelope.

I. Enter a New GPU Brand

The top two tiers of the on-die GPU lineup introduce Intel's first-ever branded GPU line.  Advanced Micro Devices, Inc. (AMD) has Radeon, NVIDIA Corp. (NVDA) has GeForce, and now Intel has announced it will call its high-end GPUs "Iris".  In the past it relied on humble numbered parts with no separate branding (e.g. last generation's Intel HD 2500 and Intel HD 4000).

Intel had briefly toyed with the idea of releasing Larrabee-branded, computation-heavy discrete GPUs.  Ultimately Intel abandoned that project, choosing to stick to an embedded path -- the path that led to Iris.

The new GPUs have been referred to as GT3 in past roadmaps (and shown running Skyrim in demos).  All of the new chips will pack support for OpenGL 4.0 and DirectX 11.1.

Intel GT3/"Iris" GPU running Skyrim [Image Source: Tiffany Kaiser/DailyTech]

On the lowest end power-wise, Intel's 15 watt Haswell chips (like the one presumably powering the company's 17 mm-thick hybrid ultrabook reference design) will get the HD 5000, a mild upgrade.  The performance increase in this segment is expected to be around 50 percent (over the HD 2500).

Intel Iris graphics

AMD's Fusion efforts were ultimately a wakeup call to Intel on the value of a high-quality embedded GPU.  But it appears the student has now become the master; the performance of Iris and Iris Pro comes closer to matching a discrete GPU than AMD's Fusion chips have thus far.

II. Discrete Graphics Performance in an Embedded Package

Things start to heat up with the U-Series 28W mobile CPUs (Core i5 branded, for instance), which will get the new "Iris"-branded GPU (full name: Intel Iris Graphics 5100).  It's roughly 2x faster than the HD 4000.
Intel Iris

The Iris Pro gets a special boost -- for the first time, dedicated eDRAM (embedded dynamic random access memory) is included alongside the GPU.  On high-end laptop chips -- the Core i7-branded H-Series of mobile chips (47-55W of typical power) -- this is expected to again represent about a 2x speedup.
Intel Iris Pro

On the desktop side, Intel's GT3e "Iris Pro" part will get an even bigger boost, reaching a 3x speedup in the R-Series (65-84W) desktop chips.  The M-Series laptop and K-Series desktop chips are also expected to have access to Iris Pro, although Intel hasn't revealed their exact performance gains.

Intel Iris Pro 2

An Ivy Bridge i7-3770K part scored around 1560 in 3DMark 11 [source], so the new Iris Pro-equipped chips should score over 3000 if Intel's performance claims are accurate.  That would put Intel's on-die graphics slightly ahead of a full discrete AMD Radeon HD 7750 GPU, which scores around 2900 [source].
Radeon 7750 HD
The Iris Pro on-die GPU is approximately as powerful as last generation's Radeon HD 7750.
[Image Source: AMD]
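
The arithmetic behind that comparison is simple enough to sketch.  The Python snippet below is a minimal back-of-the-envelope check, assuming the article's 3DMark 11 figures and Intel's claimed speedup factors; the tier labels are shorthand for this sketch, not official part names.

```python
# Back-of-the-envelope projection of Iris Pro 3DMark 11 scores,
# using the article's baseline figures and Intel's claimed speedups.
HD4000_BASELINE = 1560   # i7-3770K (Ivy Bridge) 3DMark 11 score, per the article
RADEON_7750 = 2900       # discrete AMD Radeon HD 7750 score, per the article

claimed_speedups = {
    "H-Series mobile (2x claim)": 2.0,
    "R-Series desktop (3x claim)": 3.0,
}

for tier, factor in claimed_speedups.items():
    projected = HD4000_BASELINE * factor
    verdict = "beats" if projected > RADEON_7750 else "trails"
    print(f"{tier}: ~{projected:.0f} points, {verdict} the HD 7750's ~{RADEON_7750}")
```

Even the more conservative 2x projection (~3120 points) edges past the 7750's score, which is the basis for the "slightly better than discrete" claim above.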

Whether the real-world performance truly lives up to that remains to be seen, but it's clear that this is Intel's most powerful GPU ever, and worthy of its first-ever branding.

[All slides are courtesy of Intel, via AnandTech]

Sources: Intel via AnandTech, Intel

Comments


RE: Booo
By FITCamaro on 5/2/2013 12:53:40 PM , Rating: 4
It's slightly faster, more power efficient, and offers a more powerful GPU. How is that stagnant?

Processors are getting to the point where people don't need more powerful ones. They just want ones that use less energy. The focus is on the mobile/laptop space, not the high-end gaming rig.

Sure, eventually they'll move on from the current architecture. But right now, providing some speed boosts while cutting power consumption is the priority.

RE: Booo
By retrospooty on 5/2/2013 1:58:15 PM , Rating: 4
Exactly... Even nerds like me stopped. I upgraded my PCs on average probably every 6 months since the Pentium 1 was out. Like an obsession, wasting countless dollars on it... But even I stopped at Sandy Bridge. I put it in 2 years ago, skipped Ivy Bridge, and am skipping Haswell as well. There just isn't anything I do that needs more than a quad-core i5 Sandy Bridge. That will run any game and pretty much anything else I can think of at amazing speeds. There just isn't a compelling reason to upgrade, even for the techies.

RE: Booo
By Jeffk464 on 5/2/2013 5:27:43 PM , Rating: 2
Yup, I'm pretty much in the same boat.

RE: Booo
By Jeffk464 on 5/2/2013 5:28:37 PM , Rating: 2
Core i5 SB is just an outstanding performer.

RE: Booo
By tamalero on 5/3/2013 1:25:17 PM , Rating: 2
Core i5, especially if you have an unlocked one, is a godsend. Need a bit more perf? Just overclock it a bit more!

RE: Booo
By dashrendar on 5/3/2013 11:34:31 AM , Rating: 3
What are the chances that you were in your teens or twenties when you were doing major PC upgrading, and now because you're in your 30's or 40's, married with kids, you don't have time to do it anymore?

No, I'm not describing my life. ;)

Actually, I am.

RE: Booo
By CZroe on 5/4/2013 11:09:57 AM , Rating: 2
Similar to me too. :)

I stopped years ago with an OC'd Q6600 G0. Several years later and all it really needed was an OC bump (now at 3.0GHz), an SSD, and GPU upgrades (now running tri-SLI with PPU). Well, there's a Core i5 2537M in my Alienware notebook, but both systems could use better graphics these days.

I'll admit: I do a lot of my encoding stuff on my twin brother's i7 system. :P

RE: Booo
By UnauthorisedAccess on 5/3/2013 12:14:27 AM , Rating: 2
I feel that I've been waiting for that '50% better' improvement in CPU processing power before I open my wallet, and I get disappointed when each announcement shows a 15% improvement with a large focus on on-die graphical prowess.

Phenom II to Bulldozer wasn't great. Bulldozer to Piledriver wasn't great. Sandy Bridge to Ivy Bridge wasn't great. Ivy Bridge to Haswell is showing a 15% improvement in CPU, so not great.

Yes, I know on-die GPU is the focus and that's showing massive improvements. I personally won't be interested until they negate the ~$150 [7790] graphics card segment (they're already encroaching on the ~$50 [7730] to maybe $100 [7750]), as then I'll maybe consider not buying a graphics card. I know that we have to go through this pain to get there, but I feel that we've borne this pain for over 2 years and don't, as the fun, passionate consumers we are, deserve another 2 years of 15% CPU improvements.

RE: Booo
By FITCamaro on 5/3/2013 7:58:45 AM , Rating: 2
Yeah my gaming rig is an i5 2500k running at a modest 4.2GHz with 8GB of RAM currently and a 7950. It's already almost a year old and I expect it to last me another 2-3 years at least.

Some people complain about consoles holding back PC gaming. Honestly, I've stopped caring about the graphics so much and just want quality titles with a plot and good gameplay. So I care more about that. Console gaming holding games back a bit in the graphics department has saved me the money of constant system upgrades, which were all but necessary every year or two in the early-to-mid 2000s.

RE: Booo
By Rukkian on 5/3/2013 1:31:24 PM , Rating: 2
The problem is that while the graphics are dumbed down for consoles, so are the story and plot lines. Console titles have always been more flash than substance, and they've shown that people are willing to buy the same thing year after year with just minor tweaks, so why bother making a quality title?

RE: Booo
By deathwombat on 5/2/2013 3:24:20 PM , Rating: 2
It's still stagnant on the CPU side.

I'm from the era when a computer was utterly obsolete after 5 years. In roughly 5-year intervals I upgraded from a 4MHz Z80 to a 286-20 to a 486DX 33 to a Pentium II 350 to an Athlon XP 2400+ to a Core 2 Quad 6600. There was never anything wrong with the computers that I upgraded from; they did everything that I wanted them to do. I wanted more performance because it was available, and I found ways to use it once I had it.

So, I'd been using my Core 2 Quad since 2006 and, by 2012, I figured it was time for an upgrade. I knew that the base clock speed on a Core i7 3770 was only 41% faster than what I already had, but I figured that with the addition of HyperThreading, TurboBoost, new instructions like AVX, and improvements to performance per clock, a new computer would crush my Core 2 Quad in the CPU-based distributed computing projects I like to run.

Well, HyperThreading didn't help and usually hurt performance, so I turned that off. In the end, my new computer outperforms my old one by an average of 50% in my DC projects... exactly the same amount that the clock speed increased by when TurboBoost is running on all four cores. 6 years -- more than half a decade! -- and only 50%. That would be good in any other industry, but I'm used to 1000%. Had I known the performance boost would be that small, I wouldn't have wasted my money. A 7-year-old computer is perfectly good today. In fact, by 2016, I doubt that my decade-old Core 2 Quad 6600 will need to be replaced. My wife loves it!

So, now that we're in an age where clock speeds only creep upwards and performance per clock is essentially stagnant, what can be done to entice someone to buy a new computer? Well, Moore's Law still lets Intel double the number of transistors every two years. My DC projects all scale perfectly as the number of cores increases. If I had 4 cores in 2006, new CPUs could have had 8 cores in 2008, 16 cores in 2010, and 32 cores in 2012. Instead, Intel's mainstream chips have stayed at 4 cores, and they've given half the die to a graphics chip that I don't even use!
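
(As a toy sketch of that doubling arithmetic -- assuming, as the comment does, that the entire transistor budget had gone to cores, which real chips don't do:)

```python
# Hypothetical core counts if Moore's Law transistor doubling
# (every two years) went entirely into more CPU cores.
BASE_YEAR, BASE_CORES = 2006, 4

def hypothetical_cores(year: int) -> int:
    # One doubling per completed two-year interval since the baseline.
    return BASE_CORES * 2 ** ((year - BASE_YEAR) // 2)

for year in (2006, 2008, 2010, 2012):
    print(year, hypothetical_cores(year))  # -> 4, 8, 16, 32
```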

If Iris is as good as Intel claims, maybe Intel's integrated graphics will eventually be good enough that I won't bother with a discrete graphics card. In the meantime, the stagnation that the OP complained about has left me with no reason to upgrade my now 7-year old computer, and made me a sucker for doing it. Barring some fundamental change like 10+ GHz CPUs made possible by graphene or diamonds, I highly suspect that my Core i7 3770 is the last CPU I'll ever buy. I can't do anything with it that I couldn't do with my computer from 2006, so if a CPU from 2006 is still competitive, between the decline in PC sales, the lack of competition for the performance crown, and the threats to Moore's Law posed by the laws of physics and Moore's Second Law (aka Rock's Law), will there ever be a desktop CPU that can render an i7 3770 obsolete?

RE: Booo
By retrospooty on 5/2/2013 3:36:01 PM , Rating: 2
That's the thing though... If your new CPU was 10x faster than it is now, it still wouldn't operate much faster at all. Almost nothing you do is bottlenecked at the CPU. Open up your task manager and watch it for a while. Your CPUs are probably averaging 1 to 5% utilization most of the day. Faster CPUs aren't needed unless some software comes out to utilize what we have now.
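
(If you'd rather measure than watch Task Manager, here's a minimal sketch using the third-party psutil package -- an assumption on our part, not anything from the thread:)

```python
# Sample system-wide CPU utilization once per second for a minute,
# then print the average.  Requires: pip install psutil
import psutil

samples = [psutil.cpu_percent(interval=1) for _ in range(60)]
print(f"Average CPU utilization over {len(samples)}s: "
      f"{sum(samples) / len(samples):.1f}%")
```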

RE: Booo
By deathwombat on 5/2/2013 4:19:54 PM , Rating: 3
My CPUs are at 100% all the time. (Or at least they were. I'm taking a break from distributed computing projects at the moment.) There are still applications that will never have enough power, like DC projects that simulate protein folding, find prime numbers, crack encrypted messages, or search for extraterrestrial life. Chess engines will never fail to benefit from being able to search more moves per second.

But yes, there is no need for more power to do most of what we do now. As I pointed out above, when more power became available, new technologies came along to take advantage of it. What would we do if we suddenly had 10 or 100 GHz CPUs? I don't know, but someone would find a use for it, and it might forever change the way we use computers. If you build it, they will come.

RE: Booo
By retrospooty on 5/2/2013 5:38:12 PM , Rating: 2
LOL... True, but the thing about distributed computing is that you are basically giving out your idle/spare CPU processing power to whatever cause... It works because there's SOOOOOO much free CPU power already. Other than that and a few other niche things that a very small percentage of people do, most CPUs in PCs are less than 5% utilized -- way less.

RE: Booo
By Motoman on 5/3/2013 3:57:21 PM , Rating: 2
Well...I learned the hard way about the "free" CPU power you're talking about.

This is a long time ago, mind you...but back in the day I was all about SETI@Home. I had all kinds of PCs running all the time, doing the SETI command-line client. I was in something silly like the top 0.2% of all SETI users worldwide in terms of workunits done (out of something ridiculous like a hundred million users).

Then I had a thought. And for a month, I stopped running all my PCs with SETI all the time. When my electricity bill came for that month, it was about $100 lower than it usually was.

So then I stopped running SETI XD

RE: Booo
By Jeffk464 on 5/2/2013 5:34:12 PM , Rating: 2
"so if a CPU from 2006 is still competitive, between the decline in PC sales, the lack of competition for the performance crown, and the threats to Moore's Law posed by the laws of physics and Moore's Second Law (aka Rock's Law), will there ever be a desktop CPU that can render an i7 3770 obsolete?"

Pretty much, the CPU world has become yawn-worthy. All of the current excitement and big improvements are in phone and tablet computing. It still takes a ridiculous amount of money in graphics cards to max out my Core i5 Sandy Bridge.

RE: Booo
By ShieTar on 5/3/2013 7:24:45 AM , Rating: 2
Why are you complaining that the 2012 $300 CPU only outperforms your 2007 one (the Core 2 Quad wasn't even released in 2006) by 40% in one specific, badly optimized piece of software? I mean, first off, the 3770K outperforms a Q6600 by an easy 100% in most benchmark tasks, see [link].
Second of all, you should compare the $1,000 release CPUs from both years, which gives a 200% performance boost in multithreaded applications.

And sure, the Q6600 outperformed a P3 600MHz from the year 2000 by a factor of 25 instead of the factor of 3 you get now, but it also increased the TDP by a factor of 6 to 7.

Oh, and here is a tip: if you do not care about the GPU part, don't buy it. You could have gotten the CPU performance of the 3770K without the GPU part as a Xeon E3-1230 v2, for two-thirds of the price tag.

RE: Booo
By ShieTar on 5/3/2013 7:27:28 AM , Rating: 2
Sorry, the 2nd link should have been different from the first one of course:

"If they're going to pirate somebody, we want it to be us rather than somebody else." -- Microsoft Business Group President Jeff Raikes
