


The GT3 is considered the top-of-the-line graphics tier of Intel's Haswell processors

I'm a huge "Skyrim" nerd -- and that's why I was delighted to see it being demoed using the latest Haswell processors at the Intel Developer Forum (IDF) 2012.

For those who are a little behind, Haswell is the codename for Intel's 4th Generation Core processors -- the successor to the Ivy Bridge architecture. It is built on the 22 nanometer (nm) process node with 3D/tri-gate transistors, and it is the first Intel architecture designed from top to bottom for both power savings and performance. These chips will mainly power Ultrabooks and some tablets.

David Perlmutter, executive vice president and general manager of Intel Architecture Group, gave a sneak peek at the Haswell chips at IDF 2012's first keynote yesterday. According to Perlmutter, Haswell will provide 2x the graphics performance of Ivy Bridge -- and this was demonstrated in two videos running side by side, where one was Haswell-powered and the other was Ivy Bridge-powered.

However, I got a closer look at Haswell graphics today. Two monitors were showing "Skyrim" in progress: one powered by a 3rd Generation Core chip with GT2 graphics, the other by Haswell GT3. The difference is the performance tier, where GT3 is considered top-of-the-line. The GT3 has double the number of execution units (EUs) of the GT2 while maintaining power consumption comparable to the Intel HD 4000.

The GT3 was running "Skyrim" at 1920x1080 with High settings, while the HD 4000 GPU next to it ran the same game at the same frame rate, but at Medium settings and 1366x768.
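That claim roughly checks out on pixel count alone: 1920x1080 is about 2.07 million pixels per frame versus roughly 1.05 million at 1366x768, so the GT3 was pushing nearly twice the pixels at the same frame rate -- and at higher detail settings on top of that.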

Overall, Haswell will keep certain aspects of Ivy Bridge, like Intel Hyper-Threading, the Ring Interconnect, and Intel Turbo Boost. However, Haswell chips promise twice the graphics performance while cutting power significantly.

Here's a shot of the two monitors showing "Skyrim," with GT2 on the left and GT3 on the right:

[Image Source: Tiffany Kaiser/DailyTech]


4th Generation Core GT3:
 
[Image Source: Tiffany Kaiser/DailyTech]

 
Haswell chips are expected to be released sometime in the first half of 2013.


Comments



What?
By Joz on 9/12/2012 10:48:25 PM , Rating: 2
Skyrim can run at all on Intel GPUs?

When the hell did that happen? What rock did I live under for the last year? And why the hell isn't Apple suing Intel for making square silicon devices?




RE: What?
By SPOOFE on 9/12/2012 11:23:38 PM , Rating: 2
Intel's improvements have been relatively great from Sandy Bridge -> Ivy Bridge and now -> Haswell. I say "relatively" because it's still very modest compared to a dedicated graphics card, except for maybe the lowest of the low-end GPUs.

For enthusiast gamers, it probably doesn't directly matter. However, what it DOES accomplish is to slowly raise the Lowest Common Denominator of graphical power available in the average PC. It doesn't make much sense to design games for the ~20 million (let's say) PCs running high-end graphics cards when there are ~200 million (let's say) PCs running wimpy GPUs. That's why Zynga makes the games it does.

But as integrated graphics become "better", PC gaming develops a larger potential audience for high-graphical-fidelity AAA titles. You'll see more effort put into console ports, and more PC-specific titles come out, as more investment moolah heads that way.


RE: What?
By StevoLincolnite on 9/12/2012 11:30:22 PM , Rating: 3
You forget that you can have all the graphics horsepower in the world, but it's not going to mean anything if your drivers are plain awful.

Heck Intel still carries around a game compatibility list. - http://www.intel.com/support/graphics/intelhdgraph...


RE: What?
By SPOOFE on 9/12/2012 11:39:40 PM , Rating: 3
Actually, yes, I did forget to mention it, and I'm glad you did. :D


RE: What?
By Gondor on 9/13/2012 6:16:32 AM , Rating: 2
It is a "sample list," not a definitive (and exclusive) list of all things that run on the Ivy Bridge GPU.

IIRC, Ivy Bridge's HD 4000 was closing the gap to AMD's Llano quite significantly, so doubling GPU performance in Haswell should bring Intel at least to Llano's level, not far behind the still-unreleased (on desktop) Trinity. Haswell will trash Trinity at CPU-bound tasks and power efficiency, and possibly match it in GPU terms...

AMD had a great opportunity with Trinity this spring, but it appears they had waaaaaaaaaay too many Llano chips in stock and wanted to move those first - a bad decision in retrospect; they won't be selling much Trinity on desktop except to HP/Dell and the like, with everybody waiting for Haswell.

The only "gaming" I do is "World of Tanks," which is getting great optimizations in the upcoming 0.8.0 version, and it runs on Ivy Bridge as it is. I would have bought an A10-5700 if it had come out along with the mobile chips this spring, but after waiting so long with nothing on the horizon, I guess I'll just wait a bit longer and go Intel again.


RE: What?
By StevoLincolnite on 9/13/2012 7:09:34 AM , Rating: 2
Let's be honest: yes, it is a "sample list," but that just says they're not confident in the IGPs just "working" for whatever game or application someone intends to run on them.

Take the Intel X3100 and later X4500 series of IGPs: Intel launched them claiming how great they were for gaming, yet it took over a year just to enable TnL, Shader Model 3.0, and such, which should have been available at launch.
Then, to add insult to injury, the TnL implementation was pretty poor, with Intel opting for a game-profile system for titles that benefited from hardware TnL.

Then you need to account for the much much much poorer image quality that Intel has historically had in comparison to AMD and nVidia.

Heck, mobile graphics are still a shambles in the Atom CPU line: the GMA 950 and 3150 (no TnL, no hardware-accelerated 1080p playback), and the GMA 500, 600, 3600, and 3650, which are based on PowerVR and still have horrific drivers.

Sorry to say, but Intel drivers still have a long way to go before they ever gain my confidence.


RE: What?
By Reclaimer77 on 9/13/2012 2:49:24 PM , Rating: 2
You appear to be stuck in the past, especially with your comments about image quality. AnandTech proved you wrong months ago; you should head over there and get educated.

AMD's salad days were great, no denying that. But it's over for them, and it's been over for a long time. If a slightly better IGP is all you fanbois have left to cling to, well that's just sad.

I would take a vastly superior CPU over 5-10 more FPS in a game, but that's just me. I enjoy gaming, but it's not the end all of life.


RE: What?
By vol7ron on 9/12/2012 11:32:21 PM , Rating: 2
Does it make sense to design games for the ~20 million when (let's say) ~25 million of the ~200 million want to play those games?

At some point you have to give up the fast-food mentality and say, "this is how we make it, eat it or leave it"


RE: What?
By SPOOFE on 9/12/2012 11:40:50 PM , Rating: 2
Eh. Money is money, and gamers are a very demanding bunch. For as much as people seem to be very vocal about innovation, they tend to slavishly follow the pretty, pretty graphics.


RE: What?
By SlyNine on 9/13/2012 12:18:48 AM , Rating: 2
False dichotomy. They can design a game for both and allow the graphics to scale. Does this require more work and resources? Of course. But you are also more likely to catch more of those 20 million in the process.
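A minimal sketch of that scaling idea in Python (the preset names, score thresholds, and gpu_score value are all hypothetical, purely for illustration):

# Hypothetical quality-preset scaling; the tiers and thresholds
# below are illustrative, not from any real engine.
PRESETS = {
    "low":    {"resolution": (1366, 768),  "shadows": "off",  "texture_detail": 1},
    "medium": {"resolution": (1600, 900),  "shadows": "low",  "texture_detail": 2},
    "high":   {"resolution": (1920, 1080), "shadows": "high", "texture_detail": 3},
}

def pick_preset(gpu_score):
    """Map a rough GPU benchmark score to a quality preset."""
    if gpu_score < 500:
        return PRESETS["low"]
    if gpu_score < 1500:
        return PRESETS["medium"]
    return PRESETS["high"]

# One code path serves both the ~200 million weak-GPU machines and
# the ~20 million enthusiast rigs; only the chosen preset differs.
print(pick_preset(gpu_score=1200))   # -> the "medium" preset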


RE: What?
By NellyFromMA on 9/13/12, Rating: -1
RE: What?
By NellyFromMA on 9/14/2012 2:40:18 PM , Rating: 1
It's true, sorry if you don't like it but it is. Tough life, right?


RE: What?
By jRaskell on 9/13/2012 4:21:46 PM , Rating: 2
Graphics scaling is a lot like cross platform portability. In theory, they're great ideas. In actual practice, they both come with a variety of potential hurdles and headaches that can take significant time and effort to overcome.

And the larger the breadth of scaling you attempt to accomplish, the more hurdles and headaches you are going to have to deal with. Nothing in life is free. That really is a mantra to live by, and graphics scaling is no exception.


RE: What?
By StevoLincolnite on 9/12/2012 11:26:50 PM , Rating: 3
Skyrim is a game that can run on a toaster -- being a console port and all.

The only time it truly shines is with mods, which would cripple any IGP -- heck, sometimes even high-end cards, depending on the mod.


RE: What?
By Motoman on 9/12/2012 11:29:00 PM , Rating: 3
I was going to be impressed...but then I took an arrow to the knee.


RE: What?
By SPOOFE on 9/12/2012 11:41:26 PM , Rating: 2
This man speaks nonsensicalities.


RE: What?
By Flunk on 9/13/2012 8:55:03 AM , Rating: 2
No, he's right. Skyrim is known for being light on computing resources. It will run on pretty much any discrete desktop card made in the last 6 years connected to the lowest-end Core 2 they ever made. It's not terribly impressive, although it is a start.


RE: What?
By Reclaimer77 on 9/13/2012 4:50:04 PM , Rating: 2
Yes but he's missing the point. The point is it's "good enough" to play hugely popular gaming titles like Skyrim. If the IGP can handle things like Skyrim, World of Warcraft, Diablo 3, CS, etc etc, you've pretty much covered 90% of all the gamers out there who might be on the fence about an Ultrabook.

These aren't gaming machines after all, they're all about convenience. Good enough is, well, good enough.


Running Skyrim on Intel IGP
By muhahaaha on 9/12/2012 11:47:49 PM , Rating: 2
They just forgot to say the frame rate. It will run on a Voodoo2 card too, but you'd have to wait 2 minutes for a screen refresh.

Seriously though, with all the great tech Intel puts into their processors, why can't they get graphics right?




RE: Running Skyrim on Intel IGP
By someguy123 on 9/13/2012 12:58:04 AM , Rating: 2
It's pretty difficult, and AMD/NVIDIA have direct relationships with tons of developers yet still manage to cause problems (fans turning off; Rage's drivers shipping late and being the wrong ones). It wasn't too long ago that NVIDIA drivers were the cause of 28% of Vista crashes.


RE: Running Skyrim on Intel IGP
By piroroadkill on 9/13/2012 6:26:37 AM , Rating: 2
Nope, it wouldn't run on a Voodoo2.
It requires DirectX 9.0c support, and thus Shader Model 3.0, so at the bare minimum that means a Radeon X1xxx-series card or a GeForce 6xxx-series card.


By StevoLincolnite on 9/13/2012 7:13:58 AM , Rating: 2
It *could* run on a Voodoo 2.

Take Oblivion, for example, which required Shader Model 2.0 at a minimum.
Some clever chaps rewrote the shader code, made some other under-the-hood changes, called it "Oldblivion," and made the game Shader Model 1.0 compliant, so it could run on a GeForce 3 or Radeon 8500.

Now, Skyrim's engine isn't that far removed from Oblivion's...
It's just that almost every graphics card made in the last 5+ years has been Shader Model 3.0 compatible.


I'd love to see video or benchmarks
By TakinYourPoints on 9/13/2012 12:29:42 AM , Rating: 2
I'd love to see benchmarks. Either way, Haswell is going to be AMAZING for Ultrabooks next year. The need for discrete GPUs in laptops will be further diminished unless displays like the one in the Retina MacBook Pro become common in the next year.




By Shadowself on 9/13/2012 12:16:20 PM , Rating: 2
I do believe that for notebooks above the 13" class, we'll see more and more high-resolution and high-dynamic-range displays starting next year. Within three years, I'd expect those types of displays (for larger than the 13" class) to be the norm. For those, a discrete GPU will be required until at least 2015 (Skylake) or maybe even 2017 (Skymont).

Haswell (and even more so Broadwell, if they keep improving graphics on both the Tick and Tock steps) will make Ultrabooks the only notebooks anyone will need, if the 13" class or smaller is enough screen real estate for you.


+100% Cred
By Ringold on 9/12/2012 11:56:30 PM , Rating: 2
Tiffany is a self-proclaimed "Skyrim nerd?"

Was never totally sold on her before, but +100% credibility now in my book.

Oh, yeah, Haswell, that's impressive too. But not as impressive.




FPS?
By SlyNine on 9/13/2012 12:16:59 AM , Rating: 2
Without a frame rate this is meaningless. So what if they both only achieve 15 fps? Enough to look good at first glance, but annoy the hell out of you while playing.




Runs fine on Sandy Bridge
By Diablobo on 9/13/2012 1:38:07 AM , Rating: 2
Before I got my discrete graphics, I was running Skyrim just fine on my Sandy Bridge i3-2105 at 3.1GHz with HD 3000 graphics. It ran smoothly at 1280x1024 with moderate-to-high detail settings.
I was able to run many of the newer games and practically all of the ones from over two years ago. I couldn't run some games, but for such a low price, it is possible to play on the integrated graphics, at least until you can get a graphics card.
With my limited budget, I got by just fine with the i3-2105, 4GB DDR3 RAM, a 1.5TB HDD, and a Biostar TZ68A+ mobo, and it only cost me about $400. Now with my AMD 6670 1GB GDDR5, it will play anything at 1080p with high detail settings.
The graphics card really makes a difference, but I was doing pretty well without it.




Difference between the two screens
By Granseth on 9/13/2012 2:42:12 AM , Rating: 2
There is something odd if you compare the two screens: the one on the right (GT3) has fewer shadows and different lighting than the one on the left (GT2).
It might be that the two games are just at a different time of day, but I would guess that would affect performance too.




Not all uses
By Shadowself on 9/13/2012 12:19:12 PM , Rating: 2
quote:
They will mainly power Ultrabooks and some tablets.
Haswell will, like Sandy Bridge and Ivy Bridge, cover the full range of processors -- from high-end tablets to Ultrabooks to laptops to desktops to servers. Just as with Sandy Bridge and Ivy Bridge, expect higher-end desktop and server variants to be GPU-free.




"Death Is Very Likely The Single Best Invention Of Life" -- Steve Jobs













