


Intel aims to dominate the high-end gaming market

Intel Corp. (INTC) last year successfully navigated a potentially treacherous die-shrink to the 22 nanometer node with the Ivy Bridge core.  This year it followed up with a redesigned architecture, Haswell on the 22 nm node, which incorporates aggressive power savings and better on-die GPUs.

I. Bye Bye Broadwell? (Till 2015) 

Intel's roadmap has long stated that 2014 would mark the arrival of 14 nm chips with the Broadwell die shrink.  But multiple rumor sites are showing leaked Intel slides that indicate that the 14 nm node is proving particularly onerous.

Thus for 2014 Intel won't be offering a die shrink, but a breather "Haswell refresh".  In some regards this will likely be similar to NVIDIA Corp.'s (NVDA) GeForce 700 Series, which kept the same core designs and process node (28 nm), but rebranded parts, shuffling higher performance parts to lower price points.

Intel roadmap

The pace of Intel's die shrinks will reportedly slow in 2014. [Image Source: VR-Zone]

The delay also gives Intel's mobile offerings (which are currently on 32 nm) time to catch up.  The 22 nm, quad-core tablet-geared Atoms (core: Silvermont; SoC: ValleyView; chipset: Bay Trail) are scheduled to hit the tablet market later this year, and hit the smartphone market early next year.

All of Intel's 22 nm Atom product line will be under 18 watts (as a platform, not just the CPU) and all of Intel's mainstream Core i-Series product line will be under 95 watts.

II. Extreme Edition Processors Fill in the Gaps

Intel is also planning a pair of "Extreme" edition releases this fall (Ivy Bridge Extreme) and next fall (Haswell Extreme).  Information on those releases has leaked courtesy of VR-Zone.

With growing controversy over digital rights management (DRM) in the console gaming market -- particularly with Microsoft Corp.'s (MSFT) Xbox One -- personal computer gaming may be poised for a resurgence.  With that in mind Intel's bid to dominate the enthusiast gaming market will center on its "Extreme" branded processors.

Note that since the 32 nm Sandy Bridge launch, the "Extreme" SKUs trail their mainstream counterparts of the same core design by about a year.  This makes for confusing marketing, as the current Extreme edition processors will be a generation behind the mainstream ones in core design, yet both are presumably going to be marketed under the same "Core i-Series" designation (with only the "Extreme" branding hinting at the older architecture).

Haswell-E
[Image Source: VR-Zone]

But enthusiasts are a savvy bunch, so hopefully they'll be aware of what they're getting with the latest "Extreme" chips.

Much has already been published about Ivy Bridge-E, but the VR-Zone piece deals with its architecture-refreshed successor, Haswell-E.  The leaked slides seemingly confirm the timeline of the previous leak, indicating that Haswell-E chips won't be available until sometime late next year (H2 2014). 

The leaked slides show the Haswell-E chips will (presumably) be Intel's first consumer octacore offering, packing a whopping sixteen threads.  There will also be slightly cheaper hexacore (12-thread) variants.  The cache will be bumped from 15 MB with Ivy Bridge-E to 20 MB in Haswell-E.  Both releases will support up to four PCI-Express 3.0 graphics cards.  But Haswell-E will add a neat trick, supporting the upcoming fourth generation double data rate memory, DDR4-2133.  (Ivy Bridge-E bumps memory support to DDR3-1866.)
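The memory bump is easy to quantify. As a rough sketch (assuming the standard 64-bit DDR channel width; real-world efficiency will be lower than these theoretical peaks):

```python
# Theoretical peak bandwidth of one DDR memory channel:
# transfers per second times bus width in bytes.
def peak_bandwidth_gbs(megatransfers_per_sec, bus_width_bits=64):
    return megatransfers_per_sec * 1e6 * (bus_width_bits // 8) / 1e9

ddr3_1866 = peak_bandwidth_gbs(1866)   # ~14.9 GB/s per channel
ddr4_2133 = peak_bandwidth_gbs(2133)   # ~17.1 GB/s per channel

# A quad-channel controller multiplies the per-channel figure by four.
print(ddr4_2133 * 4)   # ~68.3 GB/s aggregate
```

So the DDR3-1866 to DDR4-2133 step alone is worth about 14% more peak bandwidth per channel, before any latency or efficiency differences.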

Haswell
[Image Source: VR-Zone]

The core speed is shown to be 3.0 GHz, but the slide brags about "more overclocking options for professional and home users", so expect some wiggle room on that number.  (Quad-core Haswells are currently clocked at up to 3.7 GHz.)

Haswell-E
[Image Source: VR-Zone]

Given the assumption that these "Extreme" edition chips will be bought for powerful gaming systems with one or more discrete graphics cards, the dies will lack the built-in GPU found on their mainstream equivalents.

Sources: VR-Zone [1], [2]



Comments



In other words...
By retrospooty on 6/17/2013 6:15:14 PM , Rating: 5
Intel is so far ahead of the competition on x86, that they can now milk it like they did back in the 386-Pentium 4 days. LOL




RE: In other words...
By FITCamaro on 6/17/2013 6:51:26 PM , Rating: 5
Intel definitely was not ahead in the P3 and P4 days. AMD was pretty much at parity with Intel. When the P4 first came out it was terrible. Even the Tualatin P3s were faster than the 1.3-1.7GHz P4s, especially when Intel moved the general P4 line to SDRAM. RDRAM P4s, once they got some clock speed, were a little faster but at a far higher price since the RAM was so expensive. It really wasn't until the Core line that AMD really started to fall behind.

But yes this is them wanting to sit on an existing architecture a little longer since they really have no reason to move to a new chip. They're already hands down beating AMD on the high end and the power envelope. AMD really only wins on budget devices with better onboard GPUs.

But with more things getting moved to the GPU, AMD's lower prices with as-good-or-better GPUs could actually pay off. I mean, they now power both next-gen gaming consoles. Of course, if Sony and Microsoft just bought the design and will pay to produce the chips themselves, then AMD makes nothing off that.


RE: In other words...
By StevoLincolnite on 6/17/2013 7:32:51 PM , Rating: 3
Agreed.

AMD was definitely faster with its Athlon against the Pentium 3, until Intel moved the L2 cache on-die with Coppermine.
When AMD followed suit it regained the performance lead and beat Intel to the 1GHz barrier.

AMD had a big performance advantage all through the Thunderbird's life. It only lost the performance crown towards the end of the Athlon XP's life, until the Athlon 64 burst onto the scene and then it became "no contest" for AMD right up until the Core 2's launch.


RE: In other words...
By CaedenV on 6/18/2013 1:21:11 PM , Rating: 2
1GHz Coppermine was the CPU I built my first system around. It was a great CPU and ran circles around the first few iterations of P4 processors. AMD was a bit faster at the time (and a lot cheaper!), but had known issues with some AV editing software, so I was forced into the Intel ecosystem. Even now AMD has a better overall idea on architecture design, but they seem to be missing something to really make it sing. I love that their module approach allows for any software to take advantage of all cores (unlike Intel's HT which must be specifically called upon), but they need to get their single core performance way up before they are going to get anywhere right now.
But they have been catching up the last 2 years. The only issue is that I feel like Intel has an ace up its sleeve and they are just waiting for AMD to come out with a killer product before playing it.


RE: In other words...
By BRB29 on 6/18/2013 1:40:40 PM , Rating: 2
I took my Athlon 1700+ CPU to 2.7 GHz. It performed better than just about anything out there besides the uber dual cores back then. It was also dirt cheap compared to anything Intel.

I believe that was almost a 100% OC because the stock clock was only 1.4 GHz. Could be wrong, it's been a long time.
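For what it's worth, the arithmetic roughly holds up (the function below is just illustrative; the Athlon XP 1700+'s actual stock clock was 1.467 GHz):

```python
# Overclock gain as a percentage of the stock clock.
def oc_percent(stock_ghz, oc_ghz):
    return (oc_ghz - stock_ghz) / stock_ghz * 100

print(oc_percent(1.4, 2.7))     # ~92.9%, using the poster's recollected 1.4 GHz
print(oc_percent(1.467, 2.7))   # ~84%, using the 1700+'s actual 1.467 GHz stock clock
```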


RE: In other words...
By retrospooty on 6/17/2013 9:55:40 PM , Rating: 2
Agreed, I worded that badly. I meant to say until the P3 and P4 days. They milked it and AMD caught up. They are far enough ahead now to play that game again.

I'm sure they won't get caught sleeping this time. They will have something waiting to release on relatively short notice for sure.


RE: In other words...
By UnauthorisedAccess on 6/17/2013 7:09:48 PM , Rating: 2
This is unfortunately very true, though I'm not laughing :|

I can only hope that AMD pushes forward strongly so that Intel doesn't ride this profit train for long.


RE: In other words...
By Shig on 6/17/2013 8:16:57 PM , Rating: 2
Intel looks like they're in good shape, but they're really not imo.

AMD no longer cares about top-end x86 performance, and neither does most of the chip market. Performance per watt is the metric that matters.

Top x86 performance is good in supercomputing, but CPUs these days just feed GPUs in most of the top supercomputers.


RE: In other words...
By someguy123 on 6/17/2013 8:28:03 PM , Rating: 3
But performance per watt is where they have AMD stumped in terms of x86. If you saturate a Bulldozer chip you may end up with similar performance, but at a pretty significant TDP cost. The same applies to their server chips (and for a decent shelf, 20~40 watts per CPU adds up quickly). When it comes to APUs, AMD pretty much dominates laptops in overall performance/watt, but they haven't managed to get anything down to ARM level while Intel creeps up a bit every cycle.

I can't see AMD's current business model working out without a breakthrough in watt draw. The mass market they're aiming APUs at doesn't seem to know or care about the performance benefit over Intel's integrated graphics.


RE: In other words...
By dgingerich on 6/18/2013 11:52:03 AM , Rating: 3
This isn't so in the server market, though. AMD is doing very well in the midrange and low-range (small and medium-sized businesses) market, especially for small virtualization hosts.

I have several Dell R515 machines running VMWare for infrastructure in my lab. I had to keep the cost per machine down below $2500, and only order one per quarter. I did my research, and found for the same price I could either get an R510 with dual 2.2GHz quad core Xeons without HT and 16GB of memory, or an R515 with dual 2.7GHz 6 core Opterons with 32GB of memory. The reviews showed that the 2.7GHz Opterons outperformed the 2.2GHz Xeons, especially when running as virtual hosts. While technically the Xeons were cheaper, the entire platform made up the difference. I wound up getting a better expansion slot arrangement in the process. I'm now running 34 VMs on two R515s from a single FC storage array. I doubt the R510s could have handled that many VMs.

These days, the new R520 is in an even worse performance-per-dollar position, but Dell has quit making the R515. The R415 vs R420 race is still going, as is the R815 vs R820 (dual socket only) and the R715 vs R720, though, and AMD wins those. Performance per watt is very good on AMD's side when the entire platform is considered. Intel's chipsets are more power-hungry than most think.

The Bulldozer/Piledriver range doesn't do well in desktop machines because it's weak in two areas: decode and FP. But many server applications, like file serving, AD authentication, and web serving, don't need those very much. AMD performs much better per watt when using mainly integer units. That's what Bulldozer was designed for. (I hate it, I wanted a good gaming chip from them, but that's the way it is.) That's what it does well, and with performance per watt for the entire platform, that's where they beat Intel.

AMD has nothing to challenge Intel at the top end, like the Dell R820 quad socket and R910, and there are some complications using AMD VM hosts, but they still compete in the lower range brackets. The high end isn't as large as most think, and even the low end server CPU market has some nice profits. AMD isn't out of the game. They're still plenty in it.


RE: In other words...
By ilt24 on 6/18/2013 12:30:20 PM , Rating: 2
quote:
AMD is doing very well in the midrange a low range (Small and medium sized businesses) market, especially for small virtualization hosts.


AMD has its lowest server market share in a decade, so while you might be buying, the overall market isn't.


RE: In other words...
By someguy123 on 6/19/2013 7:40:15 PM , Rating: 2
The opterons make sense in smaller business/single workstations in performance per dollar, but as scale increases so does the hit from the TDP. When people are looking at either ~$1000 more per chip for better performance or 30w more per chip and an additional dedicated transformer to keep up the power requirements as well as a heavier cooling system they're less inclined to go for the more power hungry solution. Large datacenters are where the real bulk sales come from, which is why AMD's total server share is taking a bit of a hit even though their pricing per chip is much lower.
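The scale argument above is simple arithmetic. A hypothetical sketch (the electricity price, PUE factor, and fleet size are illustrative assumptions; only the 30 W delta comes from the comment):

```python
# Extra annual energy cost of a higher-TDP CPU fleet.
def extra_cost_per_year(delta_watts, n_chips, usd_per_kwh=0.10, pue=1.5):
    # PUE (power usage effectiveness) folds in the cooling overhead
    # the commenter mentions; 1.5 was a typical datacenter figure.
    kwh_per_year = delta_watts * n_chips * 8760 / 1000
    return kwh_per_year * pue * usd_per_kwh

# 30 W extra per chip across 1,000 hypothetical dual-socket servers:
print(extra_cost_per_year(30, 2000))   # ~$78,840 per year
```

At single-workstation scale the delta is noise; at datacenter scale it swamps AMD's per-chip price advantage, which is the commenter's point.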


RE: In other words...
By Odysseus145 on 6/18/2013 2:07:50 PM , Rating: 4
If I'm AMD, I see little reason to devote my limited time and resources to competing with Intel's high-end offerings. As others have noted, AMD dominated Intel in that segment for years. What did AMD learn from that? People weren't interested. They still flocked to Intel en masse because of old stigmas about AMD CPUs' stability. So-called enthusiasts still bought up Intel's Extreme Edition P4s and PDs, and system builders rarely offered AMD configurations.

AMD still has a lot of work to do, but I think they're on the right track.


RE: In other words...
By BRB29 on 6/18/2013 2:13:09 PM , Rating: 1
Are you sure it wasn't Intel's sales tactics? You know, the antitrust violations and the $1B they paid AMD?


RE: In other words...
By TakinYourPoints on 6/18/2013 4:21:28 AM , Rating: 3
Yup. I remember starting back in the Athlon XP days where there was a real alternative that kept Intel on their toes. Since the Core 2 Duo dropped there has been really no competition from AMD, and it kind of sucks.


RE: In other words...
By Reclaimer77 on 6/18/2013 8:59:36 AM , Rating: 2
Meh, I don't know, I don't see this as "milking".

AMD is the one who's milking, honestly. They'd rather crank the clock speed up on an inferior and wasteful architecture than actually come up with one that's competitive on some level.

Ironically, this is what Intel was doing with the Pentium 4 lol.

For Intel this is the 'new normal'. We can expect to see decent gains in CPU performance with corresponding on-chip GPU improvements. What's impressive is that major efficiency gains are made at the same time.

The theory that if AMD were more competitive Intel would be forced to release Earth-shattering CPU's is a bit speculative imo.


RE: In other words...
By retrospooty on 6/18/2013 10:20:44 AM , Rating: 2
Maybe milking is a bad word... I certainly see Intel as being far ahead now, and they are slowing it down a bit to maximise profits... And why not? They are way ahead, and AMD isn't looking like they have anything earth-shattering in the pipeline - just more of the same.


RE: In other words...
By jihadjoe on 6/18/2013 5:22:56 PM , Rating: 3
I don't think Intel is slowing things down too much. It's that they perceive the biggest threat to them at the moment is ARM, so they are directing resources toward making things more power efficient, rather than outright faster.


RE: In other words...
By Mr Perfect on 6/18/2013 6:19:49 PM , Rating: 2
They're not really maximizing profits when they're not giving people a reason to upgrade. Having a compelling new product isn't just to get people to pick you over the competition, it's also to get consumers to upgrade from whatever your last product was. I've seen no reason to upgrade from my Sandy Bridge i7 because the last two releases have been 5% to 10% performance boosts each. With the next generation pushed off another year, it looks like this chip is going to be lasting a while.
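Those per-generation gains compound, but slowly; a quick illustrative calculation at the 5-10% rates the commenter mentions:

```python
# Cumulative speedup after n generations of a fixed per-generation gain.
def cumulative_speedup(per_gen_gain, generations):
    return (1 + per_gen_gain) ** generations

# Two generations since Sandy Bridge (Ivy Bridge, then Haswell):
print(cumulative_speedup(0.05, 2))   # ~1.10x at 5% per generation
print(cumulative_speedup(0.10, 2))   # ~1.21x at 10% per generation
```

Even at the optimistic end, a Sandy Bridge owner is looking at roughly a fifth more CPU performance after two upgrade cycles, which supports the "no reason to upgrade" complaint.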


Not surprised
By Khenglish on 6/17/2013 7:41:57 PM , Rating: 3
I'm not surprised at all by the delay of 14nm. I also would not be surprised at all if it is outright cancelled.

The low-level interconnects are simply getting too resistive, and the resistance goes up with each die shrink. That, combined with gate oxides that are already literally 3 molecules thick and cannot be thinned further to reduce threshold voltages, means that maintaining performance and power consumption through die shrinks is pretty difficult. The only way I see this happening is a dramatic reduction in channel length, more than expected for a 22nm to 14nm transition.

Something has to give. The simplest solution I see is to flat out dump FETs and use lateral BJTs. The reason the industry transitioned to FETs in the first place was that they could be packed much more densely than BJTs at the time, but a lateral BJT is much smaller than the classic vertical BJT. There's even a BJT variant with an extra doping layer on the base, vastly reducing base current so that BJTs can be placed into circuits as if they were FETs.

Here's the patent. Can't find the paper on it right now:
http://www.google.com/patents/US20120139009

Ditching FETs is certainly a big move, but if Intel figured out how to make their crazy FinFETs for 22nm, I feel that they can get BJTs to work.

As for why NAND is still shrinking, I think they get to use fewer interconnect layers than CPU makers, and thus can make the interconnects they do have fatter.




RE: Not surprised
By stadisticado on 6/17/2013 9:15:46 PM , Rating: 3
"I'm not surprised at all by the delay of 14nm. I also would not be surprised at all if it is outright cancelled."

What?

Intel already has full 14nm lines running at their research campuses. As in, the process is ready and they're just in debug mode now. The power/performance thing is a real issue but only as an HVM engineering problem. The basic science to overcome this exists. The main manufacturing problem at these geometries is and will continue to be the lithography.


RE: Not surprised
By retrospooty on 6/17/2013 9:58:00 PM , Rating: 2
Yep, we won't be running into those limitations for several generations at least.


RE: Not surprised
By fteoath64 on 6/18/2013 6:25:34 AM , Rating: 2
Yeah, Intel milks every process node for as much as they can, since they are about one or two processes ahead of the competition. With chipsets, they have always been one process node behind, but with Haswell it is the same, so no more milking, as the competition on 28nm has been very good at high-performance and low-power implementations. The race to 14nm is certainly the game changer, as the delay by Intel allows the competition to catch up and match their manufacturing capabilities while making money in moving there. So Intel is playing a tricky game where they could well be burned if they choose the wrong path. The competition now is not just AMD but the ARM makers, who have a variety of fabs using different processes and are experimenting as they move along. So innovation there is fertile, while Intel's monolithic move determines their own path (as well as restricting it).
For GPU tech, Intel has improved greatly, but not enough, and is still far behind the leaders. Intel's software is still crap and I do not understand why they are not investing heavily in that.


RE: Not surprised
By ilt24 on 6/18/2013 12:42:35 PM , Rating: 2
quote:
With chipsets, they have always been one process node behind. But with Haswell, it is the same


While there will be SoC versions of Haswell, the desktop chips they just released use the 8-series (Lynx Point) chipset, made on a 32nm process.


RE: Not surprised
By Khenglish on 6/18/2013 11:47:02 PM , Rating: 3
Just because Intel can make it does not make it better. Intel had engineering samples of Tejas, which was a NetBurst architecture with an even longer pipeline than Prescott. It never made it to market because Intel never got the power to be reasonable.

I'm saying that Intel can physically make 14nm CPUs, but I am having a hard time seeing them as performance and power competitive. 32nm to 22nm only worked because Intel also adopted FinFETs to offset the increase in interconnect resistance. They'll need to do something special like that again to make 14nm work.

You said yourself that they already have 14nm research lines. Doesn't it seem odd that they already have lines for 14nm, but product has been delayed to over a year out?

Why do you think that lithography alone is the problem? Yes, FinFETs took care of substrate leakage, but how do you think increasing interconnect resistance is going to be handled? What about the fact that the gate oxide is 3 molecules thick, with 2 molecules being too thin to prevent current flow? Intel invested a lot of money to figure out their hafnium stuff in the first place to get the oxide as thin as it is. If you can't make the oxide thinner, then you can't change the doping ratio to reduce threshold voltage (and thus operating voltage) without murdering performance. Notice how Sandy Bridge and Ivy Bridge operate at the same voltages?

Higher interconnect resistance is partially offset by the fact that smaller interconnects also have less capacitance, but not being able to drop the voltage any lower is very bad.
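The interconnect point can be sketched with first-order wire scaling (an illustrative model, not Intel process data): wire resistance goes as R ∝ L/(W·H), so shrinking every dimension by a factor s raises R by s while a crude length-proportional capacitance falls by s, leaving local-wire RC delay roughly flat even as gate delay improves.

```python
# First-order RC delay of a wire when every dimension shrinks by factor s.
def wire_rc(length, width, height, rho=1.0, cap_per_len=1.0):
    r = rho * length / (width * height)   # resistance ~ L / (W * H)
    c = cap_per_len * length              # capacitance ~ L (crude model)
    return r * c

base = wire_rc(1.0, 1.0, 1.0)
s = 22 / 14                               # ~1.57x shrink, 22nm -> 14nm
shrunk = wire_rc(1.0 / s, 1.0 / s, 1.0 / s)

# R grew by s while C shrank by s, so RC delay is unchanged.
print(shrunk / base)   # ~1.0
```

This is the classic squeeze: transistors get faster with each node while local wires do not, so wire delay eats an ever-larger share of the cycle time.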


What About Mobile?
By ltcommanderdata on 6/17/2013 7:10:20 PM , Rating: 5
There was talk before about how Broadwell wouldn't be available in socketed form and now they say Broadwell isn't coming to desktop in 2014. Is it that the entire Broadwell family is delayed or is it just not coming to desktop? This could be similar to the 32nm Westmere shrink of Nehalem which was only available for low-end desktop, high-end desktop, server, and mobile whereas mainstream desktop stuck with Lynnfield speed bumps. If 14nm Broadwell is mobile/power consumption focused and is not available in socketed form, then perhaps it'll still be released in 2014, but only on mobile.




RE: What About Mobile?
By RU482 on 6/17/2013 7:29:11 PM , Rating: 2
Yeah...I found it quite odd that the ULV and lower I-core parts aren't on any of those slides


RE: What About Mobile?
By stadisticado on 6/17/2013 9:16:56 PM , Rating: 2
Agreed. The first image clearly says Desktop. There is no reference in any of the slides to the mobile parts.


RE: What About Mobile?
By smilingcrow on 6/19/2013 4:35:51 AM , Rating: 2
There have been enough strong rumours suggesting that there will not be any socketed 14nm CPUs in 2014 that the only real news is that there will be a Haswell refresh.
It makes sense for Intel to focus the 14nm rollout on the mobile lines, as that is the area that benefits most and is also where Intel has the largest fight on its hands.
For desktop, AMD are still behind, although they should be able to significantly close the gap with Intel taking a breather.
There supposedly will be 14nm BGA desktop chips next year, but those will be niche.


"Extreme", more like Sucker Edition
By Flunk on 6/18/2013 9:17:50 AM , Rating: 2
Paying extra for last year's tech with a few more cores is highway robbery. If these chips were released before the standard ones then maybe there would be a reason for the price premium. Add the fact that they don't overclock particularly well with the extra cores, and you'd have to be a real sucker to buy these re-marked Xeons. I recommend the 4770K.




RE: "Extreme", more like Sucker Edition
By laststop311 on 6/18/2013 11:36:20 AM , Rating: 2
Some of us like having 40 PCIe 3.0 lanes and quad-channel memory. It's not always about the extra cores. I do despise how the Extreme chips are released a year after the platform, but I guess they do this to make sure their yields are at full efficiency before they start making the more expensive chips. I don't think I will need to refresh my i7-980X till the 8-core Haswell-E, and even then I may wait for the first 14nm Broadwell-E. It's not a sucker edition for the people who can make use of the features it provides. Time is money. There are many instances where the extra cost of the Extreme chip pays for itself 100x over its lifetime in sped-up work or getting more done in the same time.


RE: "Extreme", more like Sucker Edition
By laststop311 on 6/18/2013 11:40:01 AM , Rating: 2
And when I get 6 years of near top-of-the-pack PC performance, the CPU ends up only costing me $160 a year to buy. Not much more than the $110 per year you pay to have 3 years of lesser performance.
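Back-solving those per-year figures implies roughly a $960 chip kept six years versus a $330 chip kept three (my reconstruction, not numbers from the post):

```python
# Amortized purchase cost of a CPU over its service life.
def cost_per_year(price_usd, years):
    return price_usd / years

print(cost_per_year(960, 6))   # 160.0 -- the Extreme-chip figure
print(cost_per_year(330, 3))   # 110.0 -- the mainstream-chip figure
```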


By Flunk on 6/19/2013 9:54:27 AM , Rating: 2
I take issue with that; the -E chips actually have worse single-threaded performance (because of lower clocks), and they're inevitably beaten by the next generation of mainstream processors. You'd end up with 3 years of barely better performance followed by 3 years of sub-par performance (assuming you replace the cheaper CPU every 3 years as you mentioned).

What you essentially are getting is a server processor that's optimized for multithreaded workloads. Unless you're editing or encoding video (if you're into this I totally understand), running servers or something else that's embarrassingly parallel the -E series chips really are a huge waste of money. And not only that, if you really want that type of performance you can buy a Xeon and get Haswell a year before Haswell -E desktops are released.


intel will Winn....
By thomasxstewart on 6/18/13, Rating: 0
RE: intel will Winn....
By thomasxstewart on 6/18/2013 4:54:03 AM , Rating: 1
Reading above commentos' is encouraging. Annund has well above average crew. reason 7nm & 5nm start to fail is softness due to weak lattice structure of material, which is applied in vacumn chamber with heated to gaseous dispersion. just can't work in less/thinner layers,too well.
14nm is ondie gddr5 skymount, broadmore will be ddr4. ddr4 is twice thruput of ddr3 8 GB/s & on die gddr5 is 12x thruput or real game changer. certainly glofo is lacking ,as design,too. lastly, t bird was starting of better cpu for '98 ps2 era startup,although kkeyboard&mouse only changed plug sixe. usb make that work better,latter.. however as glass ceiling is stagnateing development specs, with larger two way cache sroddbar more stable & bit of tech dip till faster broad & much obvivos sky, More watts to near 9.9 experience. p/s blast should push window exp, yet semperon on 780G ddr3 is windows experience:7. very cheap &8/64. real step is all gen3 now. not pci-e2,not usb2,no sata3,, all sata6. all gen 3 is crown of moment. gddr5 is warrior to be waited for, when on intel skymount die. then 2020 next redesign, yet thruput with gddr5 on die is tough Nut to beat.

drashek....


RE: intel will Winn....
By laststop311 on 6/18/2013 11:45:55 AM , Rating: 2
u phale en anglish


RE: intel will Winn....
By freedom4556 on 6/18/2013 6:57:00 PM , Rating: 2
Yeah, that's pretty bad. A Google Translate of your native language would have probably been better.


Explain how this makes any sense at all
By stm1185 on 6/17/2013 7:34:37 PM , Rating: 1
quote:
With growing controversy over digital rights management (DRM) in the console gaming market -- particularly with Microsoft Corp.'s (MSFT) Xbox One -- personal computer gaming may be poised for a resurgence.


Seeing how most PC sales are through digital sources like Steam or Origin, and those services restrict what the buyer can do with the game more than the Xbox One will, how will DRM on the Xbox One push people to the PC?




RE: Explain how this makes any sense at all
By Alexvrb on 6/17/2013 11:53:36 PM , Rating: 1
I was just thinking that myself. I use Steam all the time (and Battle.net to a lesser extent), and it's not really any different from what Xbox One is doing. Actually, it's less flexible in some ways. I can't loan out games on Steam for 30 days. I can't sell used games at ALL on Steam, not even with a fee and/or developer permission. Steam also allows third party launchers, which depending on the publisher, can be really annoying.


By greenchinesepuck on 6/18/2013 6:26:30 AM , Rating: 2
woohoo steam baby steam! word "steam" makes sonyboys cry and foam, and I love it! PS4? Steam! *cryyyy* hahaha

I mean, seriously, get a sonyboy and ask him what is Steam, you'll get totally TONS OF FUN listening to his babbling. just try it, you won't regret!


By greenchinesepuck on 6/18/2013 6:21:40 AM , Rating: 2
the irony!




Intel facing difficulties
By Lorfa on 6/18/2013 7:57:19 AM , Rating: 2
I think that this delay is because they are usually a few years ahead technologically (privately), but now they're falling behind due to the limitations of current methods.




Cool
By jharper12 on 6/18/2013 8:45:09 AM , Rating: 2
Well, this is bad news for the technology enthusiast, but great news for consumers. I kept telling friends to hold up on laptop purchases until Haswell. Now I'll buy a top of the line Haswell, and it'll stay pretty close to top of the line for two years. Awesomeness.




Moore's Law is Dying
By ptmmac on 6/18/2013 9:31:26 AM , Rating: 2
Dennard scaling died in 2003 or 2004. Intel is delaying a process node because it can, and because there is no benefit that is worth the cost. Haswell is over-promised and under-delivered; it gives a 10% improvement at most in the reviews I have read. Intel can't afford to make the next process node until it gets more of its fabs producing on this node, and there is no reason to upgrade. I have a 6-year-old notebook where I am replacing the keyboard and adding an SSD to get an upgrade. There is no compelling reason to upgrade because Moore's Law is dead, and the PC is dying along with it. The only upgrade that can be made now is higher efficiency, and that has more to do with software than hardware.

Intel is in trouble, and that is not good for the long-term business environment. Why should I buy an Intel chip at its higher-margin price if it is not going to be state of the art or better than the last generation? Intel is already admitting this by how many last-generation chips it is still selling.




Spotty shopping listings
By VoodooChicken on 6/19/2013 9:18:22 AM , Rating: 2
quote:
...enthusiasts are a savvy bunch, so hopefully they'll be aware of what they're getting with the latest "Extreme" chips.


Well, yeah, but the burden will be on retailers to correctly list what they are selling, and don't get me started on aggregators.








PCI-E lanes
By Haydon987 on 6/19/2013 2:52:09 PM , Rating: 2
So... we have fewer PCI-E lanes for graphics in Haswell-E than Ivy Bridge-E?

I know very few people use quad-card systems, and many who do use motherboards that use PLX chips, but why remove an existing feature from an UPgrade???




I like it
By laststop311 on 6/18/13, Rating: -1
RE: I like it
By flyingpants1 on 6/18/2013 3:23:24 AM , Rating: 2
That's sort of like saying, "It's great they only make cars with 100 horsepower, because I drive a Toyota Yaris." Okay, but they could make cars with 400 horsepower.


RE: I like it
By laststop311 on 6/18/2013 11:29:04 AM , Rating: 2
its rly nothing like that, but ok


RE: I like it
By freedom4556 on 6/18/2013 6:47:51 PM , Rating: 2
Actually it is, although his choice of car isn't very compelling. Basically, "I'm glad products aren't improving. This enhances the longevity of my investment." It's a selfish sentiment that notes a personal gain to the detriment of the market. And in any case, I would have compared it to the 20-ish years the Corvette spent with a ~300 hp engine, whilst GM management deliberately kept all other divisions' cars weaker for marketing reasons (or killed them outright).


RE: I like it
By PaFromFL on 6/18/2013 8:10:38 AM , Rating: 2
But I like having to run out and buy new stuff when computer systems get faster and more efficient. The SSD was a good substitute for a faster processor or graphics adapter, but it looks like we're in for a long technological dry spell.


RE: I like it
By danjw1 on 6/18/2013 11:09:49 AM , Rating: 2
I doubt Intel will nerf the Extreme processors the way they did with the Ivy Bridge ones. I expect you will see additional headroom with these.


RE: I like it
By Ammohunt on 6/18/2013 1:13:56 PM , Rating: 2
I just spec'd out and ordered two new Haswell i5 gaming rigs for the wife and me. I plan on keeping them around for at least 3 years. I really liked my FX-6100-based AMD system until I suffered frequent random BSODs, which I never experienced on any of my previous Intel platforms.


RE: I like it
By mikeyD95125 on 6/19/2013 11:34:28 AM , Rating: 2
You don't have to run out and buy anything. The technology will keep improving, so just upgrade when you feel like it (if ever).

Anyway, yeah, any Nehalem or Sandy Bridge CPU turned out to be a great investment over the past few years. Those chips definitely set a high-water mark in desktop performance. I think these newer CPUs are going to be good for the person who wants a lot of battery life in their laptop and is willing to pay MBP prices for it.


"I modded down, down, down, and the flames went higher." -- Sven Olsen














Copyright 2014 DailyTech LLC.