


New GPUs will highlight Intel's 22 nm fourth-gen Core chips

As Intel Corp. (INTC) winds up toward the launch of Haswell, the 22 nm architecture refresh that will power its fourth-generation Core i-Series processors, it's spilling details on the chips' graphics processing units.  Haswell cores will be paired with three different tiers of GPUs, depending on the power envelope.

I. Enter a New GPU Brand

The top two tiers of the on-die GPU lineup introduce Intel's first-ever GPU brand.  Advanced Micro Devices, Inc. (AMD) has Radeon, NVIDIA Corp. (NVDA) has GeForce, and now Intel has announced it will call its high-end GPUs "Iris".  In the past it relied on humble numbered parts with no separate branding (e.g. last generation's Intel HD 2500 and Intel HD 4000).

Intel had briefly toyed with the idea of releasing Larrabee-branded, computation-heavy discrete GPUs.  Ultimately Intel abandoned that project, choosing instead to stick to an integrated path, which led it to Iris.

The new GPUs have been referred to as GT3 in past roadmaps (and shown running Skyrim in demos).  All of the new chips will pack support for OpenGL 4.0 and DirectX 11.1.

Intel GT3/"Iris" GPU running Skyrim [Image Source: Tiffany Kaiser/DailyTech]
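For readers who want to verify that claim once the hardware and drivers ship, the quickest check is simply to ask the driver what it exposes.  Below is a minimal sketch of our own (an assumption-laden illustration: it presumes a Linux machine with the glxinfo utility from mesa-utils on the PATH) that prints the OpenGL version string the installed graphics driver reports:

# Minimal sketch: print the OpenGL version string the installed driver reports.
# Assumes a Linux system with `glxinfo` (mesa-utils) available on the PATH.
import subprocess

def reported_opengl_version() -> str:
    """Return the driver-reported OpenGL version string, or 'unknown'."""
    out = subprocess.run(["glxinfo"], capture_output=True, text=True, check=True).stdout
    for line in out.splitlines():
        if line.strip().startswith("OpenGL version string:"):
            return line.split(":", 1)[1].strip()
    return "unknown"

if __name__ == "__main__":
    print(reported_opengl_version())

Whether the string actually reads 4.0 or higher on a given Haswell system will depend on the driver stack, which is exactly the sort of thing a check like this is for.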

At the low end of the power range, Intel's 15-watt Haswell chips (like the one presumably powering the company's 17 mm-thick hybrid ultrabook reference design) will get the HD 5000, a mild upgrade.  The performance increase in this segment is expected to be around 50 percent (over the HD 2500).

Intel Iris graphics
[Click to enlarge]

AMD's Fusion efforts were ultimately a wakeup call to Intel on the value of a high-quality embedded GPU.  But it appears that the student has now become the master; the performance of Iris and Iris Pro comes closer to matching a discrete GPU than AMD's Fusion chips have thus far.

II. Discrete Graphics Performance in an Embedded Package

Things will start to heat up in the U-Series 28W mobile CPUs (Core i5-branded parts), which will get the new "Iris"-branded GPU (the full name is Intel Iris Graphics 5100).  It's roughly 2x faster than the HD 4000.
 
Intel Iris

The Iris Pro gets a special boost -- dedicated eDRAM (embedded dynamic random access memory) is included alongside the GPU portion of the die for the first time.  On high-end laptop chips -- the Core i7-branded H-Series of mobile chips (47-55W of typical power) -- this is again expected to represent about a 2x speedup.
 
Intel Iris Pro

On the desktop side, Intel's GT3e "Iris Pro" part will get an even bigger boost, reaching a 3x speedup in the R-Series (65-84W) desktop chips.  The M-Series laptop and K-Series desktop chips are also expected to have access to Iris Pro, although Intel hasn't revealed their exact performance increase.

Intel Iris Pro 2

An Ivy Bridge Core i7-3770K scored around 1560 in 3DMark 11 [source], so the new Iris Pro-equipped chips should be scoring over 3000 if Intel's performance claims are accurate.  That would put Intel's on-die graphics slightly ahead of a discrete AMD Radeon HD 7750 GPU, which scores around 2900 [source].
Radeon HD 7750
The Iris Pro on-die GPU is approximately as powerful as a discrete AMD Radeon HD 7750.
[Image Source: AMD]
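As a quick sanity check on that math, here is a back-of-the-envelope sketch of the projection (the baseline and comparison scores are the approximate figures cited above, the multipliers are Intel's own claims from the slides rather than measured results, and the part labels are our shorthand):

# Back-of-the-envelope projection: apply Intel's claimed speedups to the
# i7-3770K's (HD 4000) 3DMark 11 score of ~1560 and compare against the
# discrete Radeon HD 7750's ~2900. Multipliers are claims, not benchmarks.
HD4000_SCORE = 1560        # approximate i7-3770K / HD 4000 score cited above
RADEON_7750_SCORE = 2900   # approximate Radeon HD 7750 score cited above

claimed_speedups = {
    "Iris 5100 (28W U-Series)": 2.0,
    "Iris Pro (47-55W H-Series)": 2.0,
    "Iris Pro (65-84W R-Series)": 3.0,
}

for part, factor in claimed_speedups.items():
    projected = HD4000_SCORE * factor
    verdict = "above" if projected > RADEON_7750_SCORE else "below"
    print(f"{part}: ~{projected:.0f} ({verdict} the HD 7750's ~{RADEON_7750_SCORE})")

At the 2x tiers the projection lands just over 3100, slightly above the 7750's score, which is where the "over 3000" figure comes from; Intel's 3x desktop claim would put the R-Series closer to 4700 if it holds up.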

Whether the real-world performance truly lives up to that remains to be seen, but it's clear that this is Intel's most powerful GPU ever, and worthy of its first-ever branding.

[All slides are courtesy of Intel, via AnandTech]

Sources: Intel via AnandTech, Intel



Comments

Booo
By bug77 on 5/2/2013 11:29:53 AM , Rating: -1
This is the fourth generation in a row where the CPU is left stagnant and reviewers are herded towards the GPU improvements. I'd like to see a single site call shenanigans on Intel's marketing.

Another interesting topic would be AMD's inability to catch up despite four generations of no improvement, but I care little about that.




RE: Booo
By Chadder007 on 5/2/2013 11:40:21 AM , Rating: 3
Don't we get this same story literally EVERY YEAR from Intel about improved graphics, but they still suck compared to NVidia and AMD's offerings?


RE: Booo
By bug77 on 5/2/2013 11:53:16 AM , Rating: 2
Don't get me wrong, I'm all for better graphics. What I hate is having that shoved down my throat. And I hate how reviewers fail to point out that if you're not using the embedded GPU (gee, the thought of using the processor for the CPU!), there hasn't been a reason to upgrade in years.


RE: Booo
By Motoman on 5/2/2013 12:25:22 PM , Rating: 2
Virtually no one needs any more CPU power than what's been on the market for the past few years.

And realistically, only a small minority needs any more GPU power - gamers. Who, regardless of all the noise we make, are really a small portion of the overall PC market.

But the fact of the matter is that boosting the CPU power of your gaming rig is unlikely to make any difference in your gaming performance. Upgrading your video card will.

Hence...GPU more important than CPU. Has been for a while...will be for the foreseeable future.


RE: Booo
By bug77 on 5/2/2013 12:45:47 PM , Rating: 3
quote:
Virtually no one needs any more CPU power than what's been on the market for the past few years.


Even so, better IPC could mean desktop performance in a 20W envelope. I bet you'd have some use for that.


RE: Booo
By Motoman on 5/2/2013 5:59:24 PM , Rating: 2
That's important to large corporations and data centers.

For the average user with a PC at home? They're not going to notice the power savings.


RE: Booo
By bug77 on 5/2/2013 6:45:05 PM , Rating: 2
Oh, you will notice when your laptop lasts a day or more on a single charge while being as fast as a 3GHz desktop quad.


RE: Booo
By Motoman on 5/2/2013 7:45:42 PM , Rating: 2
Actually, no I wouldn't. I almost never use my laptop on the battery alone...if I'm near an outlet, I plug in. Saves my battery a lot of discharge/recharge cycles. The battery life of my laptop is all but irrelevant to me.

I use my laptop when I'm sitting on the couch in front of the TV...and there's an outlet *right there*. Or I use it when I'm on the road...and sitting in a cubicle at the office I'm visiting, there's an outlet *right there*. Or in the hotel I'm staying at...they have outlets too.

So...no. I really don't give a rat's a$$ what the battery runtime is on my laptop.


RE: Booo
By inighthawki on 5/3/2013 11:11:26 AM , Rating: 2
And you're pretty alone in that. His comment still stands that the "average consumer" will certainly notice lower power draw. Try to imagine that there are people out there that do not have the same priorities as you :)


RE: Booo
By Motoman on 5/3/2013 3:29:38 PM , Rating: 1
...try to imagine that he directed his comment directly at me, personally, and not the "average consumer."

Because he did.

If other people's priorities include senselessly burning out their laptop batteries (and putting up with the low-power performance profile while on battery) then that's their thing. I'm just tired of hearing them whine about having to buy new laptop batteries.


RE: Booo
By inighthawki on 5/3/2013 3:38:16 PM , Rating: 2
Sometimes when people say "you" they aren't directly referring to "only you in this one particular case." Sometimes it is a generic term referring to other people.

e.g. "When opening a can, you would want to use a can opener" - See how that is not necessarily directed directly at you, but is a general statement to people in general.


RE: Booo
By Motoman on 5/3/2013 3:49:57 PM , Rating: 2
Sure...in which one would examine the context in which the statement occurs to determine who the "you" is actually referring to.

I see nothing in the context to indicate he's doing anything but addressing me, individually and personally.


RE: Booo
By inighthawki on 5/3/2013 4:28:09 PM , Rating: 2
That's interesting, because when I read the comment I see nothing in the context that would imply that he IS referring directly to you. I thought he was trying to make a general statement. Maybe we should just ask him :)


RE: Booo
By Motoman on 5/3/2013 6:27:25 PM , Rating: 2
No, you're lying to avoid admitting you're being a duma$s.

Just shut the hell up please.


RE: Booo
By inighthawki on 5/3/2013 9:39:59 PM , Rating: 2
Wow, seriously? I try to be polite about something and you up the rudeness of your remarks? Very mature of you. Internet anonymity at its best.

Good day sir I don't know why I even bothered.


RE: Booo
By Motoman on 5/4/2013 12:46:34 PM , Rating: 2
Good. Maybe then you won't "bother" again. Which would make the world a better place.


RE: Booo
By JasonMick (blog) on 5/2/2013 12:47:22 PM , Rating: 4
quote:
Virtually no one needs any more CPU power than what's been on the market for the past few years.
I disagree somewhat from a computational standpoint.

While it's true per process CPU power has plateaued, there are still very important developments in the CPU ongoing.

First, with multiple cores and improved threading (combined with more memory) you can now keep more processes running simultaneously. The processes themselves haven't gotten faster; it's the number that can run smoothly that's improved.

Secondly, power has been on a steady decline thanks not only to die shrinks, but to architectural improvements to x86 as well. This is arguably as important or more so than GPU development in laptops.

Thus I agree with part of your premise -- that in single-threaded gaming GPU > CPU -- but I'd like to qualify/disagree with your generalization that this applies to all standard consumer use scenarios.


RE: Booo
By Cloudie on 5/2/2013 2:09:16 PM , Rating: 2
Performance per watt is now more important than pure performance.

That has translated to vastly better notebook CPUs as desktop-class performance has filtered down.

Just look at the notebook CPUs from 2010 (the meh that was dual-core Arrandale/quad-core Clarksfield) compared to a quad-core Sandy, Ivy or Haswell. Massive performance increase. Dual core is also significantly better though and can fit into much thinner/smaller form factors.


RE: Booo
By Bad-Karma on 5/3/2013 8:58:22 AM , Rating: 2
quote:
Performance per watt is now more important than pure performance.


That would depend on the user. Someone running a workstation for HPC-dependent apps would certainly disagree with you.


RE: Booo
By smilingcrow on 5/3/2013 11:51:05 AM , Rating: 2
Performance per watt also affects Workstations as they are limited by the maximum TDP for CPUs. So for a 150W TDP limit the higher the Performance per watt the higher the performance.


RE: Booo
By Motoman on 5/2/2013 5:58:00 PM , Rating: 2
You're assuming that the vast majority of PC owners use them like you do.

They don't.

Virtually no one needs half of the CPU power they likely have now...and wouldn't notice the difference if you doubled what they have.

Only the small portion of the market that really taxes their computers...and especially gamers...notice any difference. And if you're a gamer, chances are you're already at the "more than enough" level of CPU power - but better graphics always need better GPUs.


RE: Booo
By Bad-Karma on 5/3/2013 9:02:59 AM , Rating: 2
I agree with you as to the PC user base.

But in the microcosm that is the gaming world there are many games, even very new ones, that are more CPU dependent than GPU.


RE: Booo
By inighthawki on 5/3/2013 11:13:45 AM , Rating: 2
There are very few games that are CPU-bound, and of the ones that are, I would claim they're poorly coded.


RE: Booo
By Motoman on 5/3/2013 3:53:26 PM , Rating: 2
I'm not sure that I'd 100% go with the second thought there, but the first one is very valid.

It's a rare game that isn't GPU bound. And if it isn't, the CPU portion of the bottleneck can almost certainly be taken care of with a modest CPU - i.e. your existing CPU must really be a POS.

Virtually any gaming upgrade, if you have even a modest modern CPU, is going to center essentially all on the GPU.


RE: Booo
By someguy123 on 5/3/2013 10:55:37 PM , Rating: 2
That's just silly. "Virtually everyone" has either a very low quality discrete or an integrated system, and the vast majority of people will be saturating their CPU due to poor GPU acceleration, if any. People on GMA integrated are going to peak out their processing power by simply going to youtube/vimeo or trying to play flash games (maybe even browsing websites with flash banners).

"Virtually everyone" will be saturating either their GPU or CPU as APU offerings aren't very powerful and intel HD offerings still have issues with drivers for hardware acceleration. The enthusiasts are the ones more likely to have lots of headroom as they have faster parts for software and games that are incredibly underutilized for browsing or notepad. Average joe caring less about overall performance != lack of utilization.


RE: Booo
By Motoman on 5/4/2013 12:48:48 PM , Rating: 2
You're ridiculously wrong. You honestly think that existing APU designs are going to "peak out their processing power" watching something on Youtube?

You're a tool. Go grab some computer with a low-end AMD APU in it and watch something on Youtube. Then get back to us on how it "peaked out."


RE: Booo
By someguy123 on 5/7/2013 10:27:02 PM , Rating: 2
"Virtually everyone" has an intel system with integrated, most likely still GMA, not an APU. I like how you use ad populum only when it suits your argument. Intel's crap gpus have the bulk of the market and the only way "virtually everyone" would be under utilizing is if modern chips became cheap enough, which requires, you guessed it, faster chips/smaller processes to justify dropping price.


RE: Booo
By sorry dog on 5/2/2013 6:04:34 PM , Rating: 3
quote:
Virtually no one needs any more CPU power than what's been on the market for the past few years.


Or 64K for that matter, right Mr. Gates?


RE: Booo
By Motoman on 5/2/2013 7:46:33 PM , Rating: 2
Walk around an office building and double the CPU power in every computer on every desk.

Then watch and see that essentially no one notices.


RE: Booo
By FITCamaro on 5/3/2013 7:59:29 AM , Rating: 2
Well my office would notice. But then we're doing software development. Not spreadsheets.


RE: Booo
By ShieTar on 5/3/2013 9:54:49 AM , Rating: 3
I'm doing spreadsheets, and I would notice too. Doing post-processing of orbit simulation results available as 10,000-line Excel spreadsheets, where the formulas are a bit more non-linear and complex than the simple additions of a financial spreadsheet, means I can update a few parameters, press 'F9' to update formulas and go drink a coffee while my business laptop goes into 10 minutes of calculations.

I don't do enough of this to go and buy a workstation, but I sure was happy with the latest roll-out replacing the old Core 2 Duo with a new Core i5, cutting computation time in half.


RE: Booo
By Rukkian on 5/3/2013 1:21:28 PM , Rating: 2
I can't speak to your situation directly, but most of the time there is another bottleneck, and usually that is the hard drive. Hard drives usually end up being the culprit behind slowness in computers.

In most situations, without a faster hard drive, a faster processor will not make a huge difference if you have a fairly up to date cpu.


RE: Booo
By Motoman on 5/3/2013 2:05:10 PM , Rating: 2
...and I would imagine that both you and FIT understand that your PC usage is far from the norm. The vast majority of people aren't compiling code all the time, or doing orbit simulations.

So for you two guys...sure. Crank up your CPU all you like. But the vast majority wouldn't notice.


RE: Booo
By InsGadget on 5/4/2013 10:18:12 AM , Rating: 2
A more powerful GPU helps a lot more than just games. Your desktop environment will feel more snappy too.


RE: Booo
By Motoman on 5/4/2013 12:50:25 PM , Rating: 2
Not unless you're plodding along on some truly awful graphics chip. If you have even the cheapest AMD APU, or any Intel thing from the 3000 up, you already have more graphics power than you're ever going to need for the desktop environment.


RE: Booo
By Mitch101 on 5/2/2013 11:41:17 AM , Rating: 2
It might not be a top performer, but if Intel can make a top-end CPU, they are smart enough to figure out how to make triangles fast, eventually ending up with a GPU good enough even for gamers.

I recall one of the GPU companies saying they should be able to render Jurassic Park in real time by 2016. Once you reach reality, where do you go? Probably better physics. If Intel is 2-3 years behind, then the playing field becomes level around 2020.


RE: Booo
By mackx on 5/2/2013 3:13:20 PM , Rating: 2
to be fair, without physics it isn't "real"


RE: Booo
By Jeffk464 on 5/2/2013 5:26:21 PM , Rating: 2
Sad thing is they can already make games that look way better than what you see now. Games are all written first for consoles now and then brought over to PCs. Consoles have really held gaming back.


RE: Booo
By Garrettino on 5/2/2013 9:58:00 PM , Rating: 2
This is so true and it makes me really sad. Stupid console gamers!!!


RE: Booo
By Motoman on 5/4/2013 12:52:47 PM , Rating: 2
The only game I have any interest in that's affected that way is DC Universe Online.

There aren't enough graphics controls to make it look good...they put in enough stuff for it to work on the PS3 and then just left it like that. It desperately needs more controls for antialiasing and anisotropic filtering at a minimum.


RE: Booo
By FITCamaro on 5/2/2013 12:53:40 PM , Rating: 4
It's slightly faster, more power efficient, and offers a more powerful GPU. How is that stagnant?

Processors are getting to levels where people don't need ones more powerful. They just want ones that use less energy. The focus is on the mobile/laptop space. Not the high end gaming rig.

Sure eventually they'll move on from the current architecture. But right now providing some speed boosts while cutting power consumption is the priority.


RE: Booo
By retrospooty on 5/2/2013 1:58:15 PM , Rating: 4
Exactly... Even nerds like me stopped. I upgraded my PCs on average probably every 6 months since the Pentium 1 was out. Like an obsession, wasting countless dollars on it... But even I stopped at Sandy Bridge. I put it in 2 years ago, skipped Ivy Bridge and am skipping Haswell as well. There just isn't anything I do that needs more than a quad-core i5 Sandy Bridge. That will run any game and pretty much anything else I can think of at amazing speeds. There just isn't a compelling reason to upgrade, even for the techies.


RE: Booo
By Jeffk464 on 5/2/2013 5:27:43 PM , Rating: 2
Yup, I'm pretty much in the same boat.


RE: Booo
By Jeffk464 on 5/2/2013 5:28:37 PM , Rating: 2
Core i5 SB is just an outstanding performer.


RE: Booo
By tamalero on 5/3/2013 1:25:17 PM , Rating: 2
A Core i5, especially if you have an unlocked one, is a godsend.
Need a bit more perf? Just overclock it a bit more!


RE: Booo
By dashrendar on 5/3/2013 11:34:31 AM , Rating: 3
What are the chances that you were in your teens or twenties when you were doing major PC upgrading, and now, because you're in your 30's or 40's, married and have kids, you don't have time to do it anymore.

No, I'm not describing my life. ;)

Actually, I am.


RE: Booo
By CZroe on 5/4/2013 11:09:57 AM , Rating: 2
Similar to me too. :)

I stopped years ago with an OC'd Q6600 G0. Several years later and all it really needed was an OC bump (now at 3.0GHz), an SSD, and GPU upgrades (now running tri-SLI with PPU). Well, there's a Core i5 2537M in my Alienware notebook, but both systems could use better graphics these days.

I'll admit: I do a lot of my encoding stuff on my twin brother's i7 system. :P


RE: Booo
By UnauthorisedAccess on 5/3/2013 12:14:27 AM , Rating: 2
I feel that I've been waiting for that '50% better' improvement in CPU processing power before I open my wallet, and I get disappointed after each announcement shows a 15% improvement with a large focus on on-die graphical prowess.

Phenom II to Bulldozer wasn't great. Bulldozer to Piledriver wasn't great. Sandy Bridge to Ivy Bridge wasn't great. Ivy Bridge to Haswell is showing a 15% improvement in CPU, so not great.

Yes, I know on-die GPU is the focus and that's showing massive improvements. I personally won't be interested until they negate the ~$150 [7790] graphic card segment (they're already encroaching on the ~$50 [7730] to maybe $100 [7750]) as then I'll maybe consider not buying a graphics card. I know that we have to go through this pain to get there but I feel that we've worn this pain for over 2 years and don't, as the fun passionate consumers we are, deserve another 2 years of 15% CPU improvements.


RE: Booo
By FITCamaro on 5/3/2013 7:58:45 AM , Rating: 2
Yeah my gaming rig is an i5 2500k running at a modest 4.2GHz with 8GB of RAM currently and a 7950. It's already almost a year old and I expect it to last me another 2-3 years at least.

Some people complain about consoles holding back PC gaming. Honestly I've stopped caring about the graphics so much and just want quality titles with a plot and good gameplay. So I care more about that. Console gaming holding games back a bit on the graphics department has saved me the money of constant system upgrades that were kind of necessary nearly every year or two in the early to mid 2000s.


RE: Booo
By Rukkian on 5/3/2013 1:31:24 PM , Rating: 2
The problem is that while the graphics are dumbed down for consoles, so are the story and plot lines. Console titles have always been more flash than substance and have shown that people are willing to buy the same thing year after year with just minor tweaks, so why bother making a quality title.


RE: Booo
By deathwombat on 5/2/2013 3:24:20 PM , Rating: 2
It's still stagnant on the CPU side.

I'm from the era when a computer was utterly obsolete after 5 years. In roughly 5-year intervals I upgraded from a Z80 4MHz to a 286-20 to a 486DX 33 to a Pentium II 350 to an Athlon XP 2400+ to a Core 2 Quad 6600. There was never anything wrong with the computers that I upgraded from; they did everything that I wanted them to do. I wanted more performance because it was available, and I found ways to use it once I had it.

So, I'd been using my Core 2 Quad since 2006 and, by 2012, I figured it was time for an upgrade. I knew that the base clock speed on a Core i7 3770 was only 41% faster than what I already had, but I figured that with the addition of HyperThreading, TurboBoost, new instructions like AVX, and improvements to performance per clock, a new computer would crush my Core 2 Quad in the CPU-based distributed computing projects I like to run.

Well, HyperThreading didn't help and usually hurt performance, so I turned that off. In the end, my new computer outperforms my old one by an average of 50% in my DC projects... exactly the same amount that the clock speed increased by when TurboBoost is running on all four cores. 6 years -- more than half a decade! -- and only 50%. That would be good in any other industry, but I'm used to 1000%. Had I known the performance boost would be that small, I wouldn't have wasted my money. A 7-year-old computer is perfectly good today. In fact, by 2016, I doubt that my decade-old Core 2 Quad 6600 will need to be replaced. My wife loves it!

So, now that we're in an age where clock speeds only creep upwards and performance per clock is essentially stagnant, what can be done to entice someone to buy a new computer? Well, Moore's Law still lets Intel double the number of transistors every two years. My DC projects all scale perfectly as the number of cores increase. If I had 4 cores in 2006, new CPUs could have had 8 cores in 2008, 16 cores in 2010, and 32 cores in 2012. Instead, Intel's mainstream chips have stayed at 4 cores, and they've given half the die to a graphics chip that I don't even use!

If Iris is as good as Intel claims, maybe Intel's integrated graphics will eventually be good enough that I won't bother with a discrete graphics card. In the meantime, the stagnation that the OP complained about has left me with no reason to upgrade my now 7-year old computer, and made me a sucker for doing it. Barring some fundamental change like 10+ GHz CPUs made possible by graphene or diamonds, I highly suspect that my Core i7 3770 is the last CPU I'll ever buy. I can't do anything with it that I couldn't do with my computer from 2006, so if a CPU from 2006 is still competitive, between the decline in PC sales, the lack of competition for the performance crown, and the threats to Moore's Law posed by the laws of physics and Moore's Second Law (aka Rock's Law), will there ever be a desktop CPU that can render an i7 3770 obsolete?


RE: Booo
By retrospooty on 5/2/2013 3:36:01 PM , Rating: 2
That's the thing though... If your new CPU was 10x faster than it is now, it still wouldn't operate much faster at all. Almost nothing you do is bottlenecked at the CPU. Open up your task manager and watch it for a while. Your CPUs are probably averaging 1 to 5% utilization most of the day. Faster CPUs aren't needed unless some software comes out to utilize what we have now.


RE: Booo
By deathwombat on 5/2/2013 4:19:54 PM , Rating: 3
My CPUs are at 100% all the time. (Or at least they were. I'm taking a break from distributed computing projects at the moment.) There are still applications that will never have enough power, like DC projects that simulate protein folding, find prime numbers, crack encrypted messages, or search for extraterrestrial life. Chess engines will never fail to benefit from being able to search more moves per second.

But yes, there is no need for more power to do most of what we do now. As I pointed out above, when more power became available, new technologies came along to take advantage of it. What would we do if we suddenly had 10 or 100 GHz CPUs? I don't know, but someone would find a use for it, and it might forever change the way we use computers. If you build it, they will come.


RE: Booo
By retrospooty on 5/2/2013 5:38:12 PM , Rating: 2
LOL... True, but the thing about distributed computing is you are basically giving out your idle/spare CPU processing power to whatever cause... It works because there's SOOOOOO much free CPU power already. Other than that and a few other niche things that a very small percentage of people do, most CPUs on PCs are less than 5% utilized, way less.


RE: Booo
By Motoman on 5/3/2013 3:57:21 PM , Rating: 2
Well...I learned the hard way about the "free" CPU power you're talking about.

This is a long time ago, mind you...but back in the day I was all about SETI@Home. I had all kinds of PCs running all the time, doing the SETI command-line client. I was in something silly like the top 0.2% of all SETI users worldwide in terms of workunits done (out of something ridiculous like a hundred million users).

Then I had a thought. And for a month, I stopped running all my PCs with SETI all the time. When my electricity bill came for that month, it was about $100 lower than it usually was.

So then I stopped running SETI XD


RE: Booo
By Jeffk464 on 5/2/2013 5:34:12 PM , Rating: 2
quote:
so if a CPU from 2006 is still competitive, between the decline in PC sales, the lack of competition for the performance crown, and the threats to Moore's Law posed by the laws of physics and Moore's Second Law (aka Rock's Law), will there ever be a desktop CPU that can render an i7 3770 obsolete?


Pretty much, the CPU world has become yawn-worthy. All of the current excitement and big improvements are in phone and tablet computing. It still takes a ridiculous amount of money in graphics cards to max out my Core i5 Sandy Bridge.


RE: Booo
By ShieTar on 5/3/2013 7:24:45 AM , Rating: 2
Why are you complaining that the 2012 $300 CPU only outperforms your 2007 one (the Core 2 Quad wasn't even released in 2006) by 40% in one specific, badly optimized piece of software? I mean, first off, the 3770K outperforms a Q6600 by an easy 100% in most benchmark tasks, see http://www.anandtech.com/bench/Product/53?vs=551 .
Second of all, you should compare the $1k release CPUs from both years, thus http://www.anandtech.com/bench/Product/53?vs=551 with a 200% performance boost in multithreaded applications.

And sure, the Q6600 outperformed a P3 600MHz from the year 2000 by a factor of 25 instead of the factor of 3 you got now, but it also increased the TDP by a factor of 6 to 7.

Oh, and here is a tip: If you do not care about the GPU part, don't buy it. You could have gotten the CPU-Performance of the 3770K without the GPU-part as a Xeon 1230v2, for 2/3rds of the price-tag.


RE: Booo
By ShieTar on 5/3/2013 7:27:28 AM , Rating: 2
Sorry, the 2nd link should have been different from the first one of course:

http://www.anandtech.com/bench/Product/53?vs=552


RE: Booo
By theapparition on 5/2/2013 3:54:54 PM , Rating: 2
quote:
This is the fourth generation in a row where the CPU is left stagnant and reviewers are herded towards the GPU improvements.

I have no idea what you are talking about. Newer CPUs can be significantly faster.

Just because web browsing and game benchmarks don't show much improvement, doesn't mean that one can't get massive increases in computing power. For some of my applications, what took weeks has turned to days.


RE: Booo
By FITCamaro on 5/2/2013 4:46:30 PM , Rating: 2
Yeah another advancement that is unseen for most people has been the improvement in virtualization support in CPUs.


RE: Booo
By ShieTar on 5/3/2013 7:07:27 AM , Rating: 2
You are joking, right? The Core i7-3960X achieves a factor of 2-3 on CPU performance versus that of a Core 2 Extreme QX9770. And there are no indications that Haswell won't be able to push this up by another 5% to 10%.

The GPU side just gets more reporting because it is more exciting right now, still improving exponentially rather than in the more linear fashion of modern CPU development.
