
The GT3 is considered the top-of-the-line graphics tier of Intel's Haswell processors

I'm a huge "Skyrim" nerd -- and that's why I was delighted to see it being demoed using the latest Haswell processors at the Intel Developer Forum (IDF) 2012.

For those who are a little behind, Haswell is the codename for Intel's 4th Generation Core processors -- the successor to the Ivy Bridge architecture. It is built on the 22 nanometer (nm) process node with 3D/tri-gate transistors, and is the first Intel architecture designed from top to bottom for both power savings and performance. Haswell chips will mainly power Ultrabooks and some tablets.

David Perlmutter, executive vice president and general manager of Intel Architecture Group, gave a sneak peek at the Haswell chips at IDF 2012's first keynote yesterday. According to Perlmutter, Haswell will provide 2x the graphics performance of Ivy Bridge -- and this was demonstrated in two videos running side by side, where one was Haswell-powered and the other was Ivy Bridge-powered.

However, I got a closer look at Haswell graphics today. Two monitors were showing a "Skyrim" game in progress: one powered by a 3rd Generation Core chip with GT2 graphics, the other by a Haswell chip with GT3 graphics. The difference? Performance tier, with the GT3 being top-of-the-line. The GT3 has double the number of execution units (EUs) of the GT2 while maintaining power consumption comparable to the Intel HD 4000.

The GT3 was running "Skyrim" at 1920x1080 resolution on High settings, while the HD 4000-class GPU next to it ran the same game at the same frame rate, but on Medium settings at 1366x768 resolution.

Overall, Haswell retains certain aspects of Ivy Bridge, like Intel Hyper-Threading, the Ring Interconnect and Intel Turbo Boost. However, it provides twice the graphics performance while cutting power significantly.

Here's a shot of the two monitors showing "Skyrim," with GT2 on the left and GT3 on the right:

[Image Source: Tiffany Kaiser/DailyTech]

4th Generation Core GT3:
[Image Source: Tiffany Kaiser/DailyTech]

Haswell chips are expected to be released sometime in the first half of 2013.


RE: What?
By SPOOFE on 9/12/2012 11:23:38 PM , Rating: 2
Intel's improvements have been relatively great from Sandy Bridge -> Ivy Bridge and now -> Haswell. I say "relatively" because it's still very modest compared to a dedicated graphics card, except for maybe the lowest of the low-end GPUs.

For enthusiast gamers, it probably doesn't directly matter. However, what it DOES accomplish is slowly raising the Lowest Common Denominator when it comes to graphical power available in the average PC. It doesn't make much sense to design games for the ~20 million (let's say) PCs running high-end graphics cards when there are ~200 million (let's say) PCs running wimpy GPUs. That's why Zynga makes the games it does.

But as integrated graphics become "better", PC gaming develops a larger potential audience for high-graphical-fidelity AAA titles. You'll see more effort put into console ports, and more PC-specific titles come out, as more investment moolah heads that way.

RE: What?
By StevoLincolnite on 9/12/2012 11:30:22 PM , Rating: 3
You forget that you can have all the graphics horsepower in the world, but it's not going to mean anything if your drivers are plain awful.

Heck, Intel still carries around a game compatibility list.

RE: What?
By SPOOFE on 9/12/2012 11:39:40 PM , Rating: 3
Actually, yes, I did forget to mention it, and I'm glad you did. :D

RE: What?
By Gondor on 9/13/2012 6:16:32 AM , Rating: 2
It is a "sample list", not a definitive (and exclusive) list of everything that runs on the Ivy Bridge GPU.

IIRC the HD 4000 in Ivy Bridge closed the gap to AMD's Llano quite significantly, so doubling GPU performance in Haswell should bring Intel at least to Llano level, not far behind the still-unreleased (on desktop) Trinity. Haswell will trash Trinity in CPU-bound tasks and power efficiency, and possibly match it in GPU terms ...

AMD had a great opportunity with Trinity this spring, but it appears they had waaaaaaaaaay too many Llano chips in stock and wanted to move those first -- a bad decision in retrospect; they won't be selling much Trinity on desktop except to HP/Dell and the like, with everybody waiting for Haswell.

The only "gaming" I do is "World of Tanks", which is getting great optimizations in the upcoming 0.8.0 version, and it runs on Ivy Bridge as it is. I would have bought an A10-5700 if it had come out along with the mobile chips this spring, but after waiting for so long with nothing on the horizon, I guess I'll just wait a bit longer and go Intel again.

RE: What?
By StevoLincolnite on 9/13/2012 7:09:34 AM , Rating: 2
Let's be honest: yes, it is a "sample list", but that just says they're not confident in the IGPs just "working" for whatever game or application someone intends to run on them.

Take the Intel X3100 and later X4500 series of IGPs: Intel launched them and claimed how great they were for gaming, yet it took over a year just to enable TnL, Shader Model 3.0 and other features that should have been available at launch.
Then, to add insult to injury, the TnL implementation was pretty poor, with Intel opting to use a game-profile system for games that benefited from hardware TnL.

Then you need to account for the much much much poorer image quality that Intel has historically had in comparison to AMD and nVidia.

Heck, mobile graphics are still a shambles in the Atom CPU line with the GMA 950 and 3150 (no TnL, no hardware-accelerated 1080p playback),
and the GMA 500, 600, 3600 and 3650, which are based on PowerVR and still have horrific drivers.

Sorry to say, but Intel drivers still have a long way to go before they ever gain my confidence.

RE: What?
By Reclaimer77 on 9/13/2012 2:49:24 PM , Rating: 2
You appear to be stuck in the past, especially with your comments about image quality. AnandTech proved you wrong months ago; you should head over there and get educated.

AMD's salad days were great, no denying that. But it's over for them, and it's been over for a long time. If a slightly better IGP is all you fanbois have left to cling to, well that's just sad.

I would take a vastly superior CPU over 5-10 more FPS in a game, but that's just me. I enjoy gaming, but it's not the end all of life.

RE: What?
By vol7ron on 9/12/2012 11:32:21 PM , Rating: 2
Does it make sense to design games for the ~20 million when (let's say) ~25 million of the ~200 million want to play those games?

At some point you have to give up the fast-food mentality and say, "this is how we make it, eat it or leave it"

RE: What?
By SPOOFE on 9/12/2012 11:40:50 PM , Rating: 2
Eh. Money is money, and gamers are a very demanding bunch. For as much as people seem to be very vocal about innovation, they tend to slavishly follow the pretty, pretty graphics.

RE: What?
By SlyNine on 9/13/2012 12:18:48 AM , Rating: 2
False dichotomy. They can design a game for both and allow the graphics to scale. Does this require more work and resources? Of course. But you are also more likely to catch more of those 20 million in the process.

RE: What?
By NellyFromMA on 9/13/12, Rating: -1
RE: What?
By NellyFromMA on 9/14/2012 2:40:18 PM , Rating: 1
It's true, sorry if you don't like it but it is. Tough life, right?

RE: What?
By jRaskell on 9/13/2012 4:21:46 PM , Rating: 2
Graphics scaling is a lot like cross platform portability. In theory, they're great ideas. In actual practice, they both come with a variety of potential hurdles and headaches that can take significant time and effort to overcome.

And the larger the breadth of scaling you attempt to accomplish, the more hurdles and headaches you are going to have to deal with. Nothing in life is free. That really is a mantra to live by, and graphics scaling is no exception.

