


Tablets should be here in time for holiday 2013; 22 nm smartphone chips likely shelved till 2014, though

Intel Corp.'s (INTC) fourth generation of Core i-Series processors is upon us.

I. Meet Haswell

At the 2013 Computex trade show in Taipei, Taiwan, Intel announced the availability of its latest personal computer CPUs, powered by the new Haswell core design, a reworked architecture built on the same 22 nanometer process as last generation's Ivy Bridge.

Here's a table of the SKUs announced by Intel:
Fourth-generation Core i-Series
Note that Intel's tables show only the processor models geared toward mobile and pre-packaged desktop systems (denoted by the 'HQ', 'R', 'Y', and 'U' suffixes), which are ball grid array (BGA) designs soldered by OEMs onto their circuit boards.

Newegg.com currently lists a variety of socket-style desktop Haswell i5- and i7-Series chips ranging from the 3.0 GHz quad-core i5-4430 ($189.99 USD) up to the 3.5 GHz quad-core i7-4770K ($349.99 USD).  These chips carry either no letter, an 'S', or a 'K' suffix; the 'K' parts have an unlocked multiplier and the highest clocks and prices, while the 'S' parts are lower-clocked, lower-power variants at the bottom of the price range.

Haswell die
A colorized Haswell die

System builders will need new motherboards; Intel has moved to the LGA 1150 socket.  The good news is that the socket will be compatible with next year's die shrink, Broadwell, just as the LGA 1155 introduced with Sandy Bridge was compatible with Ivy Bridge.

II. Iris Pro Shines, But Lower-End iGPUs in Desktop Chips Struggle

According to early benchmarking by AnandTech, the new chips offer particularly impressive double-digit floating-point and x264 video processing gains over Ivy Bridge.  Overall, the processor managed about an 8.3 percent average bump over Ivy Bridge in CPU performance across various tests, a 17 percent average bump over the two-generation-old Sandy Bridge, and a 44 percent bump over Nehalem.
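Those generational averages compose multiplicatively, so they can be cross-checked against each other. A quick back-of-the-envelope sketch, using only the averages quoted above (not AnandTech's underlying benchmark data):

```python
# Sanity-check the quoted average CPU gains; these are the article's
# averages, not raw benchmark scores.
haswell_vs_ivy = 1.083    # ~8.3% average gain over Ivy Bridge
haswell_vs_sandy = 1.17   # ~17% average gain over Sandy Bridge

# Implied Ivy-over-Sandy gain, assuming the gains compose multiplicatively
ivy_vs_sandy = haswell_vs_sandy / haswell_vs_ivy
print(f"Implied Ivy-over-Sandy gain: {(ivy_vs_sandy - 1) * 100:.1f}%")  # 8.0%
```

That implied roughly 8 percent Ivy-over-Sandy bump is in line with how those two generations were generally reviewed, which suggests the quoted averages hang together.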

Haswell
While the new architecture is more of an iterative refinement of the Core design than a reinvention, it nonetheless delivers solid gains.

The story on the integrated on-die GPU side is a tale of two chips.  Mid-range desktop models like the i7-4770K ($349.99 USD) pack an HD 4600 GPU.  While it more or less lives up to Intel's claims of a 50 percent bump in frame rates over the previous generation (HD 4000), Intel's performance is still pretty abysmal compared to that of Advanced Micro Devices, Inc. (AMD).

For example, the AMD A10-5800K ($129.99) ties or beats the HD 4600 in most game benchmarks by AnandTech.  In some cases the win appears to be enough to make a game that would be unplayable on the Intel chip playable on the AMD chip.

Is it amazing to see AMD's Fusion processors compete against a chip that is nearly three times as expensive and manage a modest win?  It is pretty astounding and a testament to AMD's targeted strategy -- building gaming-ready systems on a budget.

Of course the Intel chip destroys its rival in CPU performance, but that's to be expected when you're paying approximately 2.7 times as much.
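The "approximately 2.7 times" figure falls straight out of the listed prices, and the same arithmetic underpins any performance-per-dollar comparison. A minimal sketch (the `perf_per_dollar` helper and any scores fed to it are hypothetical placeholders, not benchmark data):

```python
# Price ratio between the two chips, using the article's listed prices.
i7_4770k_price = 349.99   # Intel Core i7-4770K (Newegg)
a10_5800k_price = 129.99  # AMD A10-5800K (Newegg)

ratio = i7_4770k_price / a10_5800k_price
print(f"Intel/AMD price ratio: {ratio:.1f}x")  # ~2.7x, as quoted

# Performance-per-dollar framing -- scores passed in would come from
# whatever benchmark suite a reviewer runs; none are supplied here.
def perf_per_dollar(score: float, price: float) -> float:
    return score / price
```

On this framing, a chip can lose a head-to-head benchmark and still win once each score is divided by its price, which is exactly the budget-gaming argument being made above.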

On the other hand, the Iris Pro (the HD 5200) -- the integrated GPU found in pricier Haswell SKUs -- is a completely different story.  Not only does it blow away its Fusion foe, it trades blows with an NVIDIA Corp. (NVDA) GeForce GT 640, a discrete graphics card that retails for around $79.99 USD.  The key to that performance is a 128 MB L4 cache, dubbed "Crystalwell".

So what's the price and why aren't we finding this mean performer on Newegg?

The bad news is that high-end desktop parts don't get Crystalwell -- you'll only see it (and Iris Pro) in ball grid array (BGA) designs from OEMs, which are soldered to the motherboards of high-end laptops and mid-range desktops.  Thus the price premium to get Iris Pro and Crystalwell is unknown.

Crystalwell
Crystalwell makes Iris Pro a stud, but isn't available in socket-style SKUs yet.

Also up in the air is how the situation will look when Iris (HD 5100) -- the little brother of Iris Pro -- lands.  Intel's first batch of processors did not include any chips with the mid-range Iris.

Thus the situation is that, CPU-wise, Intel remains the king of single- and multi-threaded performance, albeit with pricey chips.  But in terms of gaming or other graphics-heavy work without a discrete GPU, Intel's pricey chips fall flat -- except for the Iris Pro.  And Iris Pro and its Crystalwell cache will only be found in a few desktops, plus a smattering of gaming laptops and pricey ultrabooks.

Thus we're left with a release in which Intel makes good on nearly all its promises and shows some great performance, but leaves something to be desired in terms of selection and flexibility for system builders.  

Perhaps follow-up releases will fill in those gaps, but for now builders of budget (discrete-GPU-less) desktop systems must either pay more for better graphics performance and give up CPU upgradability (the Intel Iris Pro route), or pay less and retain upgradability, albeit at lower CPU performance (the AMD Fusion route).  As budget gaming is today one of the biggest PC markets, it should be interesting to see how sales play out.

III. Mobile

The final piece of the story is the mobile side.  

Intel is boasting that its latest chips allow "9 hour" battery life via designs reaching as low as 7 watts (most of the i3-Series chips announced have a 15-watt TDP).  As previously discussed, standby battery life is expected to receive an even bigger boost.
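For a rough feel of what those numbers imply for battery sizing, capacity scales directly with average power draw and runtime. This is a simplification -- TDP is a thermal ceiling, not average consumption, and real platform draw includes the display -- but the arithmetic is:

```python
# Back-of-the-envelope battery sizing: capacity (Wh) = avg power (W) x hours.
# Assumes average platform draw equals the figure given, which overstates
# real-world draw (chips idle far below TDP).
def battery_wh(avg_power_watts: float, hours: float) -> float:
    return avg_power_watts * hours

print(battery_wh(7.0, 9.0))  # 63.0 Wh for 9 hours at a steady 7 W
```

A 63 Wh battery is at the large end for an ultrabook, which is why the claimed 9 hours depends on average draw sitting well below the 7-watt figure.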

The chipmaker's Executive Vice President Tom Kilroy boasted of "more than 50 different 2-in-1" (ultrabook-cum-tablet) design wins.  But Intel's flagship tablet architecture -- the Intel Atom line -- won't receive its update until later this year.

Haswell ultrabooks and tablets
Intel's 2-in-1 lineup is starting on the pricier end with Haswell-based designs.

Intel promises that even slimmer form factors with 8 hours of battery life will be available this holiday season with the release of its 22 nm, quad-core, tablet-geared Atom design (core: Silvermont; SoC: ValleyView; chipset: Bay Trail).

Thus far Intel tablets haven't sold well.  But the company is rumored to be scoring some big Android design wins, even ahead of Silvermont, and Intel is also hopeful that Windows 8.1 will outperform its embattled predecessor in sales.  The best-case scenario for Intel is a dogfight between hot-selling Android and Windows 8.1 tablets, with Intel acting as the arms supplier on each side of the battle line.  Whether that plays out, though, remains to be seen.
Intel Atom Z2760
The next generation of Atom SoCs will arrive in time for the holidays.

The company is facing strong resistance from entrenched ARM chipmakers like NVIDIA Corp. (NVDA) and Qualcomm Inc. (QCOM).  It will likely have to prove that its x86 Silvermont designs can beat these rivals' ARM-powered designs not only on computing performance, but also on price and power efficiency.

Intel will look to add value by pairing its next-generation Atoms with its multi-mode LTE modem, the Intel XMM 7160, in its holiday partner products.  That chip should see more traction in upcoming smartphone products; however, 22 nm smartphone Atom chips are not expected to trickle out until early 2014.

Sources: Intel [press release], Intel [Haswell product page]



Comments

Biased garbage spewing in excess
By BSMonitor on 6/4/2013 5:06:13 PM , Rating: 2
Yawn, Mick F's up another Intel article.

"Is it amazing to see AMD's Fusion processors compete against a chip that is nearly three times as expensive and manage a modest win? It is pretty astounding and a testament to AMD's targeted strategy -- building gaming-ready systems on a budget.

Of course the Intel chip destroys its rival in CPU performance, but that's to be expected when you're paying approximately 2.7 times as much"

LMAO




RE: Biased garbage spewing in excess
By stm1185 on 6/4/2013 5:27:31 PM , Rating: 3
Show of hands who has built a gaming-ready system on a budget without a dedicated GPU...


RE: Biased garbage spewing in excess
By x10Unit1 on 6/4/2013 5:37:08 PM , Rating: 2
Will a "Not worth reading" work?


RE: Biased garbage spewing in excess
By Samus on 6/5/2013 2:05:43 AM , Rating: 3
I agree. They need to make a GPU-less version of this chip. It's a complete waste of die area (and money) for people who are 100% going to use a dedicated video card.


By BRB29 on 6/5/2013 9:10:55 AM , Rating: 2
They did, it's called the FX series.


By phazers on 6/7/2013 3:54:19 PM , Rating: 2
quote:
It's a complete waste of die area (and money) for people who are 100% going to use a dedicated video card.


Not true. I built a 3770K system with a factory OC'd HD 7970 GPU, and using Virtu MVP I get around 10-15% higher frame rates with the combination compared to the 7970 alone.


RE: Biased garbage spewing in excess
By bebimbap on 6/4/2013 7:15:40 PM , Rating: 2
I consider myself Neutral on the subject of AMD and Intel, if AMD offered superior performance and similar costs, I would buy AMD however AMD with mobo at similar single threaded performance usually costs more than intel or that AMD just can't deliver the performance I require. I shop at Microcenter so these things are cheaper for me so comparisons like this are different for me than others.

Also, you can't compare Intel's flagship 4770k with an A10. It would be similar to comparing a $105k Porsche Carrera 4S with a $42k Toyota Avalon. Yes, a Porsche costs 2.5x as much, and they are both cars; if that's all that was different about the two, no one would ever buy or want a Porsche.

a better comparison would be an i3/pentium of sandy/ivy/haswell line with a A10 because even AMD admits to having nothing that competes with Intel's top of the line products.

I don't think it's impartial to make the comparisons you did. I know you say you are just reporting the impartial truth, but you implied some big opinions. Though you could have done it unintentionally, I'm leaning toward Jason Mick being a smart enough guy to imply whatever he wants in his writing, and a professional enough writer to proof it too.


RE: Biased garbage spewing in excess
By Cheesew1z69 on 6/4/2013 7:48:27 PM , Rating: 2
quote:
however AMD with mobo at similar single threaded performance usually costs more than intel
Um...lol


RE: Biased garbage spewing in excess
By bebimbap on 6/4/2013 9:05:13 PM , Rating: 1
I laugh too; the best AMD chip now is about the same as an i5-2500k (2.5 years old) in gaming performance. No one sells the 2500k anymore because it's that old, but when it was sold, the prices were about this.

AT Microcenter
i5-2500k + Asus Sabertooth = ~340
fx8350 + Asus Sabertooth = ~345

even right now the 4670k + sabertooth bundle is 420.

Even if they were exactly the same, in the games I play, such as BL2, Intel wins.
http://www.techspot.com/review/577-borderlands-2-p...
An FX-8150 at 4.5GHz can only get 51 fps with a GTX 680, while a stock 2500k (3.3GHz) gets 61 fps. Now you can say an 8320 or 8350 is 30% faster per clock (which is a lot) and OCs higher, so you'll achieve about ~70fps, but the same old 2500k can OC to 4.5-5.0GHz and get 75-80+ fps.

So based on my personal needs, buying AMD forces me to upgrade more often, seeing as an obsolete i5-2500k can match the performance of a current FX-8350BE in games.

I build/buy AMD computers, but the CPU gets upgraded every 2 years and the mobo every 4 years, while the Intel systems get upgraded entirely every 4 years. Though I don't have to upgrade the mobo with the CPU, AMD CPUs are not free, so I think of it as AMD CPU x 2 + mobo vs. Intel CPU x 1 + mobo. Different people have different needs. Enterprise-level corporations use both Intel and AMD in their server rooms and direct different workloads at them, though more enterprises are leaning toward ARM or Intel at the moment as AMD is losing a lot of ground in that space. But back to the point: in my case, because of the upgrade cycle, Intel is cheaper at the same single-threaded performance level. Or you can say that at the same cost, Intel offers more performance.


By BRB29 on 6/5/2013 9:09:06 AM , Rating: 2
990fx sabertooth is $185.
z87 sabertooth is 260
79 sabertooth is 350

fx 8350 is 179
3570k is 189

Microcenter has the same cpu+mobo bundle deal for both amd and intel systems.

FX 83xx is not 30% faster per clock than 81xx.

AMD's upgrade path generally is longer than Intel's.


RE: Biased garbage spewing in excess
By BRB29 on 6/5/2013 9:17:32 AM , Rating: 2
quote:
I consider myself Neutral on the subject of AMD and Intel, if AMD offered superior performance and similar costs, I would buy AMD however AMD with mobo at similar single threaded performance usually costs more than intel or that AMD just can't deliver the performance I require. I shop at Microcenter so these things are cheaper for me so comparisons like this are different for me than others.


It's not 1997; single-thread performance is still relevant, but multithreading has been here for a while and most new games use it.

quote:
a better comparison would be an i3/pentium of sandy/ivy/haswell line with a A10 because even AMD admits to having nothing that competes with Intel's top of the line products.

And if you did compare them, the value is on AMD's side. The CPU performance is overall equal with intel win in single and AMD in multi. AMD wins hands down in GPU side. Intel wins in power consumptions.


RE: Biased garbage spewing in excess
By bug77 on 6/5/2013 9:52:04 AM , Rating: 2
quote:
And if you did compare them, the value is on AMD's side. The CPU performance is overall equal with intel win in single and AMD in multi. AMD wins hands down in GPU side. Intel wins in power consumptions.


I keep saying this: AMD's GPU advantage is zero in practice. If you play game, you need a dedicated video card. If you don't, the solution from intel is more than enough.

Not sure what you mean by "intel wins in single" and "amd wins in multi".


RE: Biased garbage spewing in excess
By bug77 on 6/5/2013 10:16:24 AM , Rating: 2
Also, AMD is burning through more power than intel, which is not what I'd expect from a modest CPU.

See A10-6800/6700 vs i7-4770k: http://www.hardwarecanucks.com/forum/hardware-canu...


RE: Biased garbage spewing in excess
By BRB29 on 6/5/2013 10:49:02 AM , Rating: 1
It's a desktop chip. It's burning through more power because it gave you more fps in games. People don't buy desktops to save power.

Let's look at a more balanced comparisons for performance.

http://www.legitreviews.com/article/2209/11/

And total system power for that set up
http://www.legitreviews.com/article/2209/12/

For an entry level gaming system, it is the best you can get right now at that price range.

Comparing an A series apu to a high end intel system is ridiculous. A series wasn't meant to have more CPU power. It was meant to have a good balance of cpu + gpu. Almost all FM2 systems will probably never see a discrete card in there. Putting a gtx680 to test gaming performance on an A series is as dumb as buying a Toyota Camry to drag race.


By someguy123 on 6/5/2013 8:45:53 PM , Rating: 2
He's talking about the power consumption. The A series is, as you said, focused on its gaming performance, yet it draws 20 watts more in Prime, which only tests the CPU. Haswell seems to have higher overall draw on desktops as well, since they don't seem to be benefiting from the VRM shift to the die, yet it's well below the A series in power draw during gaming and synthetic CPU testing.

AMD is clearly ahead in terms of integrated graphics, but it's far behind in watt efficiency and single thread performance. If you look at current benchmarks the 8350 has been struggling to remain competitive in even multithreaded performance. Value is much better but performance and power draw are behind.


RE: Biased garbage spewing in excess
By BRB29 on 6/5/2013 10:53:47 AM , Rating: 2
quote:
I keep saying this: AMD's GPU advantage is zero in practice


lol that doesn't even make sense.

quote:
If you play game, you need a dedicated video card. If you don't, the solution from intel is more than enough.

Most people game on their computers. Most computers don't have discrete cards. Don't assume you are everyone. Most people I know won't spend $100+ on a video card to play games. If they were to do that, they would rather just buy a console.
An A10 lets them play games casually without the additional cost. What's so hard to understand?


RE: Biased garbage spewing in excess
By bug77 on 6/5/2013 11:08:47 AM , Rating: 2
quote:
What's so hard to understand?


I don't know, maybe AMD's market share if all you said were true?


RE: Biased garbage spewing in excess
By BRB29 on 6/5/2013 11:12:52 AM , Rating: 2
Your new current argument is market share determines cpu performance?


RE: Biased garbage spewing in excess
By bug77 on 6/5/2013 11:36:48 AM , Rating: 2
You argued that most people are served ok by AMD. My argument was that you're full of it.
See here why: http://store.steampowered.com/hwsurvey/processormf...


RE: Biased garbage spewing in excess
By BRB29 on 6/5/2013 11:43:23 AM , Rating: 2
You must have missed over a decade of news about Intel's business practices. I get it. You're an Intel fanboy. Just come out and say it instead of going in circles.


By bug77 on 6/5/2013 12:15:12 PM , Rating: 2
Whatever, dude...


By BRB29 on 6/5/2013 11:07:29 AM , Rating: 2
quote:
Show of hands who has built a gaming-ready system on a budget without a dedicated GPU..


Gaming ready means being able to play games. You're confusing it with a real gaming system. Anybody building a real high end or mid range gaming system should not buy any of the A series apu.

The A10 has always been targeted and marketed as a budget CPU+GPU package. Almost everybody who buys an A series will not buy a discrete card; that defeats the purpose of it. If you want an AMD gaming system, then you would go with FX + discrete.

The A10 6800k gives you almost midrange gaming performance for just $142. You can build a whole system for general purpose + casual gaming for ~$300. It was meant to be a budget solution. A mid range graphics card would cost around $150 in comparison.

I wanted to play games on high or ultra settings. I went with intel because I could afford it. Most people wouldn't spend $600+ on a desktop for casual gaming and computing.


By BurnItDwn on 6/5/2013 11:53:55 AM , Rating: 2
I built a cheap A8 system for a young cousin of mine. It's using the integrated video. It's perfectly fine for older games or for many modern games at lower resolutions.

He mostly just plays minecraft.

Whenever I get around to upgrading my main machine, I'll upgrade my 2nd machine with the primary machine card, and then I'll give him a 4870 which should be a nice boost.


RE: Biased garbage spewing in excess
By Coldfriction on 6/4/2013 7:05:24 PM , Rating: 3
I'm glad Mick put it in there. I'm tired of all of these review sites ignoring the costs of Haswell chips in their reviews. By the very nature of being a product, there is only one metric to which all products can be normalized for comparison, and that is cost to the consumer. Hard drive reviews have long had GB/$ as one of the primary points I look for. It's about time that becomes a focus in CPU/GPU/APU reviews. We should see performance/$ normalized on whichever test-bed software suite a review site chooses to use. The problem is that I get the feeling most/all of the review sites out there are bought by their advertisers, and they'd rather not make them unhappy. I usually read forums to get legit reviews these days, and it's one of the reasons I like DailyTech so much.


RE: Biased garbage spewing in excess
By BSMonitor on 6/5/2013 11:33:57 AM , Rating: 1
Except it is a completely garbage comparison...

I.e., a fully loaded Corvette, not the base model coupe, runs upwards of $65K..

Would anyone compare that price/performance to a base Mustang??

Oh look! The Mustang is almost as fast at $30K less...

Garbage, uneducated, uninformed BS


By Coldfriction on 6/5/2013 12:01:49 PM , Rating: 3
So you don't believe that reviewers should tell you what the extra $30k gets you when you buy a Corvette over a base Mustang? Does it get you faster speeds, higher acceleration, a smoother ride, better gas mileage, etc.? Sure as hell you better know performance/$ to make a purchasing decision. The entire concept of a monetary system is that you can valuate nearly anything in terms of currency so that trade is simple to understand. I don't want to buy a Corvette because some review site says it's better than a Mustang; that kind of review is the "garbage, uneducated, uninformed BS" that needs to be eliminated.

CPU and GPU performance may not be a commodity in its truest sense, but they are similar enough for the market to measure them as such. Not all oil is the same, but it's all priced in terms of $/barrel. Same concept here. CPUs and GPUs are tested using the same software as though they are the same!

For your car analogy, we can say a race track is analogous to a software testing suite. Putting a Corvette on the race track and recording the time it takes to do 50 laps, then putting the Mustang on the track and recording the time it takes to do 50 laps, will give you an honest comparison. Now take the time difference and compare the prices: $30k could then be said to buy you the difference in time. This is how reality works, and it's how intelligent, educated people think about money.

So if a reviewer is honest and pro-consumer they will attempt to tell their readers what difference in performance is for the difference in prices. Most reviewers leave the price analysis to the reader because their advertisers don't want them to show a truly unbiased review; they want to be favored. The first rule of business is to make the person giving you money happy whether it be a customer or an employer.


Not looking good for enthusiasts.
By someguy123 on 6/4/2013 7:18:48 PM , Rating: 2
While haswell looks like it could do pretty well in mobile if the overall system draw hits intel's estimates, it is pretty close to being worse than ivy thanks to the extra heat generated by moving the VRM on die. Even lower headroom than ivy for overclocking thanks to the temperature bump, but the performance per clock evens it out.

The bulk of the consumer market is moving towards low profile/mobile, but it's still sad to see intel gradually cutting back on CPU performance with no competition in sight. I guess I need to give up on reasonable CPU encode speeds until 2020.




RE: Not looking good for enthusiasts.
By bug77 on 6/5/2013 6:59:57 AM , Rating: 2
quote:
...but the performance per clock evens it out.


What performance per clock? Check out HardOCP, they have a clock for clock comparison. Performance per clock has barely changed over Sandy Bridge.


RE: Not looking good for enthusiasts.
By someguy123 on 6/5/2013 8:53:09 PM , Rating: 2
That's compared to SB-E, which is a different segment compared to the 4770 and has 2 more cores. It actually is faster per clock in their testing anyway with all cpus @ 4.5ghz if you look at their single thread hyper pi test.


By bug77 on 6/6/2013 5:23:59 AM , Rating: 2
I was actually looking at 2600k. Compared to that, there's a difference of about 10%. And that's across three generations. As an owner of a 2500k, I'm having a hard time justifying an upgrade.


Broadwell not Haswell
By Shadowself on 6/4/2013 6:28:38 PM , Rating: 2
I don't expect Haswell based chips to make a significant dent in ARM's control over the tablet and smartphone markets. A bit of a dent, but nothing significant.

I do expect Broadwell based chips to make a significant dent in tablets and possibly smartphones. I don't think they'll take over, but they will make a significant dent. However I really don't expect Broadwell tablet/smartphone ready chips until September/October 2014.

And if ARM does not make major leaps forward in both capability and power conservation, Skylake/Skymont based chips might take over both tablets and smartphones. But I'm not counting out the ARM guys yet.




RE: Broadwell not Haswell
By Da W on 6/4/2013 6:55:03 PM , Rating: 2
Too late. It won't significantly beat the Ivy Bridge in the Surface Pro and won't be any thinner. I don't see any reason to upgrade, and those who didn't get an Ivy Bridge tablet probably won't get a Haswell either, so it's up to Windows 8.1 to make up for it. Otherwise Android and ARM will rule tablets too.


RE: Broadwell not Haswell
By Shadowself on 6/6/2013 8:58:21 AM , Rating: 2
No, it's not too late.

Yes, as I said above, Haswell won't make a significant dent in ARM's control, as Haswell is not a huge jump in either processing power or power efficiency. However, it would be foolish to count Intel out. It has the money, time and talent to get there with Broadwell and its feature shrink. Then there are the upcoming Skylake and Skymont versions. If anyone thinks those two will not be truly viable tablet and smartphone variants, they are delusional.

And note: the mid range and high end variants of these chips are not the ones that will go into tablets and smartphones. The ones that will go into these will be the low power and ultra low power variants. Plus the trend is clear, Intel is focusing more and more on this segment of the market as the mid to high end is stagnating. It wasn't that long ago that Intel debuted its chip families at near the high end with few chips above the capability of the initial release set. Now Intel debuts the mid to low end chips with the high end chips coming as much as a year later. I really don't expect the Xeon variants of Haswell to show up until 2014 (and some are saying Haswell Xeons may never happen).

And why shouldn't Intel's focus shift? Assuming Intel eventually does ship Haswell Xeons, how many will it sell in total across the entire industry? A couple million or so? Contrast that with the low end of the spectrum which might ship tens of millions to a single tablet/smartphone maker.

Windows 8.1 might make a dent, but I doubt it. Windows 8.1 when coupled with Broadwell might. But Microsoft is going to have to slim down the bloat drastically. Microsoft can "fix" its OS. Microsoft has done it before. (Microsoft has been in a "tick-tock" of its own for over two decades: dumb OS, fixed/good OS, dumb OS, fixed/good OS, going as far back as Windows 2.0 as a dumb-OS point.)


spelling nitpick
By db2460 on 6/4/2013 4:52:54 PM , Rating: 2
That whoo-whoo sound you are hearing is the grammar police pulling you over...

You probably mean "a tale of two chips" and not "a tail of two chips."








"Spreading the rumors, it's very easy because the people who write about Apple want that story, and you can claim its credible because you spoke to someone at Apple." -- Investment guru Jim Cramer














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki