


Samsung hopes to make up for previous load-balancing flaws with new fully independent octacore chip

Samsung Electronics Co., Ltd. (KSC:005930) -- the world's largest smartphone maker -- designs the system-on-a-chip for its smartphones in-house, much like its rival Apple, Inc. (AAPL).  Samsung's chip line -- the Exynos -- has struggled, despite Samsung being a top leader in mobile chip manufacturing (Samsung manufactures Apple's latest and greatest A7, the brains of the iPhone 5S).

I. Exynos 5 Struggles -- is the Third Time the Charm?

The Exynos 5 Dual (Exynos 5250) is featured in the Nexus 10 and other products, but it showed surprisingly poor performance, getting beaten by Apple's dual-core A5X and A6X chips.  Its successor -- the Exynos 5 Octa (Exynos 5410) -- promised to change that, but was bogged down by cache design issues that seriously limited the potential of its big.LITTLE core design.

The idea of big.LITTLE was essentially to assign lighter threads to smaller cores, turning on the bigger cores only as workloads increased.  Thus the Exynos 5 Octa uses a mix of four light Cortex-A7 cores and four beefy Cortex-A15 cores.
  
Samsung Galaxy S4 wide
Some were disappointed by the performance of the Galaxy S IV

But because of quirks in the cache coherency and how Samsung built the early versions of the chip, once the load exceeded the single-threaded limit of the lean A7, the scheduler transitioned to the A15 cluster and would use it exclusively until it was saturated, only then spilling over to the remaining three A7 cores.  In other words, if an A7 can handle 0.5 units of work and an A15 can handle 1.5 units of work, you might expect a workload of three tasks weighing (0.4, 0.2, 0.4) units to occupy three A7 cores -- but instead it occupies one A7 and two A15 cores.
big.LITTLE
Only the least efficient big.LITTLE model -- Cluster Migration -- was supported with the initial octacore Exynos 5.

This "bug", which Samsung never officially addressed, significantly impacted performance and power consumption in early units of the Galaxy S4 that shipped with the Exynos 5 Octa.
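The worked example above can be modeled in a few lines of Python. This is a toy sketch, not Samsung's actual scheduler: the per-core capacities are the hypothetical numbers from the example, and the "early 5410" policy is a simplified stand-in for the behavior described (extra little cores never wake up, so work spills onto big cores).

```python
# Capacities in arbitrary "work units" -- the hypothetical numbers above.
A7_CAPACITY = 0.5    # one little Cortex-A7 core
A15_CAPACITY = 1.5   # one big Cortex-A15 core

def ideal_schedule(tasks):
    """True heterogeneous scheduling: any task light enough for an A7
    gets its own little core; only heavier tasks wake a big core."""
    return ["A7" if t <= A7_CAPACITY else "A15" for t in tasks]

def early_5410_schedule(tasks):
    """Simplified model of the described early-5410 behavior: additional
    little cores are never woken, so once the first A7 fills up, every
    further thread spills onto a power-hungry A15."""
    placement, a7_load = [], 0.0
    for t in tasks:
        if a7_load + t <= A7_CAPACITY:
            placement.append("A7")
            a7_load += t
        else:
            placement.append("A15")
    return placement

tasks = [0.4, 0.2, 0.4]
print(ideal_schedule(tasks))       # ['A7', 'A7', 'A7']
print(early_5410_schedule(tasks))  # ['A7', 'A15', 'A15']
```

Running the toy model on the (0.4, 0.2, 0.4) workload reproduces the mismatch: three tasks that all fit on little cores end up dragging two big cores out of sleep.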

Some believe this is why Samsung primarily shipped the Galaxy S4 with a third-party chip -- Qualcomm Inc.'s (QCOM) Snapdragon 600 (quad-core Krait architecture).  An estimated 70 percent of the units shipped with Snapdragon 600s, with the remainder powered by the troubled Exynos octa-core chip.
Exynos
The Exynos line has at times disappointed, forcing Samsung to fall back on third-party designs.

A significantly improved Exynos 5 Octa variant (Exynos 5420) entered mass production in August, fixing many of the load-balancing issues and giving a major bump to the previous model's underpowered GPU (moving from Imagination Technologies Group Plc's (LON:IMG) tri-core PowerVR SGX544MP3 to ARM's six-core Mali-T628 MP6).

This second-run octacore chip was used in the Galaxy Note 3.

Samsung Galaxy Note 3
 
Samsung is rumored to be porting some of the thread-balancing improvements from the second-generation Exynos 5 Octa back into the first-generation model, meaning that newer Galaxy S4s may offer better power efficiency.

II. Going 64-Bit -- More Than Just Hollow Hype, But a Complex Issue to be Sure

And most recently, Korean newspaper IT Today is reporting that development of the Exynos 6 is wrapping up.  Scheduled for production early next year -- reportedly at 14 nm -- the chip may seize the process lead from Intel Corp. (INTC), whose 22 nm LTE-equipped chips are shipping now and will pop up in a slew of products in the January-February window.

Like Intel, Samsung will reportedly use a FinFET 3D transistor design at the 14 nm node.  Samsung paired with International Business Machines Corp. (IBM) and GlobalFoundries to develop this process.  This alliance goes by the name "Common Platform Group".

IBM 14 nm wafer
IBM, GlobalFoundries, and Samsung co-developed the 14 nm FinFET process. [Image Source: PC Mag]

Like the Apple A7, the Exynos 6 will reportedly be 64-bit.  Some may be puzzled as to why Samsung and Apple are jumping on this bandwagon.  Naively, the main use of 64-bit addressing is to reach larger amounts of memory -- typically amounts beyond the 3 to 4 gigabytes addressable by 32-bit architectures.  Virtually no smartphone has this much memory, so 64-bit capability seems wasteful, from a simplistic viewpoint.
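The 32-bit ceiling mentioned above is simple arithmetic -- a quick sanity check in Python (the roughly 3 GiB figure that applications actually see reflects the OS reserving part of the address space for itself):

```python
ADDRESS_BITS = 32
GiB = 1 << 30  # 2**30 bytes

# A 32-bit pointer can name at most 2**32 distinct byte addresses:
addressable = 1 << ADDRESS_BITS
print(addressable // GiB)  # 4 (GiB)

# In practice the kernel reserves a slice of that space, which is why
# 32-bit applications often see closer to 3 GiB of usable memory.
```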

To understand why the issue is slightly more complex, start by considering that memory is indeed the primary reason to push for 64-bit adoption -- but in the server space, not in phones.

ARM, like Intel and Advanced Micro Devices, Inc. (AMD) before it, has seen the writing on the wall -- most modern server chips need to be 64-bit.  If smartphone and tablet chipmakers jump to 64-bit early, then by the time 64-bit ARM server chips arrive in late 2014 or early 2015, there will be a growing ecosystem of A64/ARMv8-compatible apps.  This clearly benefits ARM's server ambitions.  And ARM is willing to incentivize smartphone makers to make the move by offering a more powerful core architecture, packed with more registers to squeeze program data into.

A7 AnandTech
Going to 64-bit brought Apple some nice perks; Samsung's upcoming Exynos 6 SoC should see similar gains. [Image Source: AnandTech]

Due to the nature of the ARM instruction set, the jump from 32-bit ARMv7 (15 general-purpose registers plus the program counter) to 64-bit ARMv8/A64 (31 64-bit general-purpose registers plus the program counter) also increases the register count.  Register count, of course, is not inherently tied to register width -- a 32-register, 32-bit CPU would have been possible too.  However, ARM is deliberately bundling these two design decisions together (as Intel and AMD have) to further its server push.
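A back-of-the-envelope way to see why the bigger register file matters (a toy model, not how a real register allocator works): if a hot loop keeps k values live at once and the ISA exposes r general-purpose registers, roughly max(0, k - r) of those values must spill to the stack and be reloaded from memory.

```python
def spilled(live_values: int, registers: int) -> int:
    """Toy estimate: live values that won't fit in registers spill to memory."""
    return max(0, live_values - registers)

ARMV7_GPRS = 15   # r0-r14 (r15 serves as the program counter)
ARMV8_GPRS = 31   # x0-x30 in the A64 execution state

# A routine juggling 24 live values: spills on ARMv7, none on ARMv8.
print(spilled(24, ARMV7_GPRS))  # 9
print(spilled(24, ARMV8_GPRS))  # 0
```

Fewer spills means fewer loads and stores on the hot path, which is one of the "performance incentives" discussed below that comes along with the 64-bit switch even before any app needs more memory.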

In addition to more registers, ARM is packing numerous other exclusive optimizations into its 64-bit instruction set architecture (ISA).  The reason is that ARM wants to jump into the server space and compete with Intel to power the future cloud computing market.

AnandTech founder Anand Lal Shimpi explores these issues in more depth in his iPhone 5S review.  But suffice it to say that jumping to 64-bit ARM architectures and instruction sets is about much more than just memory.  ARM Holdings has offered a lot of performance incentives to convince smartphone chipmakers to switch, and the tactic appears to have worked; Apple has a 64-bit chip, and 64-bit chips are working their way through the pipelines of the other top ARM chipmakers (Qualcomm, Samsung, and NVIDIA Corp. (NVDA)).

Back to the 64-bit Exynos 6: it will reportedly use a mix of Cortex-A53 and Cortex-A57 cores from ARM Holdings.  The cache issues have reportedly been fully ironed out, so the cores can be driven independently, allowing each load to be balanced onto the lightest applicable set of cores.

ARM Cortex A53/A57
The Exynos 6 will reportedly mix Cortex-A53 and Cortex-A57 cores.

The first Exynos 6 chip may pop up in the Galaxy S5, which is expected to launch next spring.  Before that, a minor refresh of the Galaxy S4 (the rumored Galaxy S4 "Advanced") is likely to arrive, potentially bumping the onboard processor to the Exynos 5420, with its more powerful onboard GPU.

Source: IT Today



Comments

28nm to 14nm in 1 year with no warning?
By blaktron on 10/29/2013 7:00:52 PM , Rating: 2
This seems really implausible, and your source article doesn't mention this. Can you provide a link to your sources? Or if they are confidential, can you provide at least that. This would be a giant, market shaking shift (as in has never happened in the history of the microprocessor) and this is the first time I have seen anyone print it.




RE: 28nm to 14nm in 1 year with no warning?
By stm1185 on 10/29/2013 7:23:44 PM , Rating: 2
Wasn't Intel supposed to have 14nm out Q4 this year, then delayed it. So it's not like they would be the only ones capable of it next year. Big accomplishment catching up to Intel, but nothing that couldn't be done.


RE: 28nm to 14nm in 1 year with no warning?
By inighthawki on 10/29/2013 7:48:17 PM , Rating: 2
I was under the impression that the delay was exclusive to Broadwell (a few bugs in the chip design itself), not the 14nm transition -- hasn't Intel already shown off 14nm Broadwell samples?


By deathwombat on 10/30/2013 2:14:03 PM , Rating: 1
It's not about bugs in the chip, it's about yield. Intel calls it a "defect density issue". They need to increase the number of useable chips per wafer before they ramp up production.


By mik123 on 10/29/2013 8:24:16 PM , Rating: 2
The fact that Intel delayed 14nm shows it's becoming increasingly challenging to shrink the technology.
I'd be surprised if anyone else moves to 14nm next year.


RE: 28nm to 14nm in 1 year with no warning?
By Khato on 10/29/2013 8:36:52 PM , Rating: 4
Except that even if Samsung released their '14nm' process next year they wouldn't be caught up to Intel in anything but name. All indications are that if Intel went by Samsung's metric then their current 22nm process should be called 14nm and their upcoming 14nm process should be called 10nm. Basically after all the press Intel made about their process advantage with 22nm the foundries decided to re-label their future process nodes in order to appear competitive.


RE: 28nm to 14nm in 1 year with no warning?
By Mint on 10/29/2013 11:17:20 PM , Rating: 2
Ahh, that makes more sense now. I was wondering how Samsung got to 14nm all of a sudden.


By Mitch101 on 10/30/2013 10:16:36 AM , Rating: 2
Same here. If Samsung leapfrogged Intel in manufacturing process, that would be huge news, as Intel seems to be a solid year ahead of everyone else when it comes to die shrinks.


By fteoath64 on 11/8/2013 5:16:43 AM , Rating: 2
From what I have read (mostly EETimes, DigiTimes, BSN), Intel's 14nm is actually their 16nm. They mislabeled it for advantage?! So their 22nm is actually 24nm, since they were at 32nm before. So going half-node will reach 24nm and going full node will get Intel to 16nm. Some engineers had better confirm this, since it is what I got from reports.

It appears 28nm HKMG for Samsung/GloFo is correct, and their half-node is something like 31nm, but they call it 20nm. Again, stretching it for marketing? Samsung really has its 20nm tech for 2013. But a full node by Q2 of next year, possibly?


By mjv.theory on 10/30/2013 5:47:13 AM , Rating: 2
Wasn't Samsung first below 20nm for memory, or was it Micron? Either way, Samsung is probably the world leader in memory and memory process. It may not correlate perfectly to processors, but it does at least demonstrate some plausibility.

Perhaps they had decided to miss the 22nm node and instead concentrate their efforts toward going straight to 14nm for processors. It is certainly not beyond the realms of possibility. Especially if, with IBM and GF, they had decided that it was the only way to catch up to Intel. You might imagine that between them they would have the talent and resources to pull off such a strategy.


By nafhan on 10/30/2013 10:35:20 AM , Rating: 2
They'd need to be making 14nm chips NOW in order to get them into devices shipping early next year. So, I agree with you. If they were currently making 14nm chips, I think we would have heard about it. The Note 4, on the other hand... I could definitely see nextgen 14nm Exynos in the Note 4.


RE: 28nm to 14nm in 1 year with no warning?
By KurgSmash on 10/31/2013 2:34:37 AM , Rating: 2
Yeah, it's a bit beyond implausible -- it outright will not happen. I'd be shocked and amazed if you saw any ARM 14nm chips out before 2015.


By KurgSmash on 10/31/2013 2:35:57 AM , Rating: 2
Lol, other than ARM chips manufactured by Intel (Altera), of course.


By amanojaku on 10/29/2013 11:17:07 PM , Rating: 5
People have questioned the need to ADVERTISE 64-bit as an ADVANTAGE when the device does not support anywhere close to 4GiB. Apple's devices have 1GiB of RAM, negating the biggest advantage of going fully 64-bit: memory addressing.

The cryptographic and I/O functions could have been implemented as 64-bit (or even 128-bit) without a 64-bit ISA. ARMv7 supported 64-bit and 128-bit registers; those chips just weren't used frequently, probably due to die size and power consumption. The switch to a smaller process node and the removal of Thumb made it possible to add more transistors (in this case, double the number of registers at twice the size) in a smaller space, while achieving lower power consumption and greater performance. iPhone 5C/S benchmarks confirm this: there is an average 10% performance increase for anything not related to crypto or I/O.

Samsung, on the other hand, is the only vendor to have a phone AND a tablet that could take full advantage of 64-bit: the Note 3 and 10.1, which have 3GiB of RAM each. When its phones come with 64-bit CPUs, it's likely the RAM will be bumped to 4GiB, as well.


By troysavary on 10/30/2013 7:43:27 AM , Rating: 2
64 bit is about more than just memory. Whether or not they are useful on phones yet is debatable. And it appears that Apple's 32 to 64 iOS transition is going far from smoothly. My point wasn't that Apple did it right. It was that Android fans who were quick to dismiss Apple when they went to 64 bits would be praising Samsung for doing the same. And, lo and behold, here you are.


By Monkey's Uncle on 10/30/2013 9:48:38 AM , Rating: 3
Dismissing the need to go to 64 bits is not really the reason people were bashing Apple. It is Apple's yelling at the top of their lungs that the 5S's 64-bit architecture would make all the world's problems disappear. That the 5S is (supposedly) sooooo much faster than its 32-bit competition right now (regardless of Anand's benchmarks) had anyone who wasn't an Apple shill dismissing these claims as nothing more than pure marketing hype. The iPhone 5S does not feel any faster than the 32-bit iPhone 5 or 5C. That is because there is nothing about the 5S that really leverages all that extended 64-bit capacity. All the apps they are running are 32-bit apps, and any speed improvement in 32-bit apps is marginal.

Moving to 64 bits was a positioning move for Apple and most of us know it. There is nothing in the Apple store that will actually leverage that 64-bit capability. The same will apply to Samsung's Exynos, Qualcomm's Snapdragons and Nvidia's Tegras. At this point it is positioning. There will be nothing in the Google Play Store or Windows store that leverages 64-bits when those are released either. But once the platform is actually out there with a healthy user base expect to see the 64-bit apps starting to trickle in.

But make no mistake here. 64-bit is coming whether there are immediate gains in doing so or not. I am not praising or slamming any company for doing so because 10 years from now we will expecting more and more out of our mobile devices. Moving to 64 bits in the hardware world gets us positioned today for the software and advanced systems we will be seeing tomorrow. But moving to that platform is a necessity for these guys if they intend to continue to compete in a market that is continually evolving.


By fteoath64 on 11/8/2013 6:09:56 AM , Rating: 2
Good points. But look at the "speed" evolution that is expected by the market and users. Granted, the manufacturing process is getting better, moving into 20nm, then 14nm, then 10nm. The power budget for the battery is not increasing any more, so power management is key to having good battery life. The A15, pushing close to 2GHz, is getting into the power-hungry band yet not delivering as much performance as wanted. Going 64-bit seems a better way, as one gets roughly 25% from the architectural change in the chip itself even running 32-bit code. Faster RAM and a wider bus help, and the A57 has a killer FPU and faster integer units. So it is a better solution than pushing the A15 another notch or two.
Qualcomm did very well pushing their S800 Krait cores to 2.26GHz while maintaining the same power budget as a standard A15 chip. That was their internal optimization of logic that is their own design, and no one else has it. So did Apple with the A7, but in 64-bit guise. New technology sells, especially when it delivers the performance expected.


By Reclaimer77 on 10/30/2013 1:10:58 PM , Rating: 3
You're trolling, in other words. Putting everyone in two convenient opposing camps in order to manufacture conflict.

64 bit is more than memory, yes. However memory is the ultimate road block. Samsung already has devices with 3 gigs, they will HAVE to go 64 bit to address next years 4 gig models.

What's the big controversy here? You're being silly.


By Jeffk464 on 10/31/2013 3:46:44 PM , Rating: 2
Phones are going to have to make the switch to 64bit eventually, so why procrastinate?


By GTVic on 11/3/2013 8:38:18 PM , Rating: 2
Do the individual apps need to access more than 3GB each on a phone? AFAIK there are no phones that even have more than 3GB of RAM.

Depending on how the OS allocates RAM to the apps, it should be feasible to avoid a 64-bit OS or processor and still support more than 3GB of total system RAM. In that case the limitation would be on the available RAM per app. That limitation is not likely to affect phone apps in the foreseeable future.


64bit..
By zodiacfml on 10/30/2013 2:42:05 AM , Rating: 2
thanks for that insight......after reading many articles why 64bit is overkill on devices.




RE: 64bit..
By mjv.theory on 10/30/2013 5:53:12 AM , Rating: 3
Agreed. The move to 64-bit in phones and tablets is partly long-term strategy, partly linked to server plans and partly advantages of A57/A53 over A15/A7. There is no mystery to the general and specific motivations of the present trend to switch to 64-bit processors in mobile.


RE: 64bit..
By haukionkannel on 10/30/2013 2:36:03 PM , Rating: 2
Indeed. When they go to 64 bit now, there will be 64 bit programs in few years and in few years 4Gb or even more in phones or tablets are more than likely to happen. And as it has been said, servers needed 64bit some time ago...
So this is for the future.


RE: 64bit..
By boeush on 10/30/2013 1:45:45 PM , Rating: 4
The day is coming -- probably sooner than most would suspect -- when 4 GiB of RAM, even on a phone, will seem laughably tiny.

Remember, once upon a time 640 KiB seemed plentiful...

It's basically a dynamic of "if you build it, they will come": as soon as excess hardware resources are made available, new applications will arise to fully consume them (and/or, developers can stop worrying about optimizing for size/performance, and instead start focusing on quantity/quality of content, higher-quality design/code, and shorter delivery times...)


What could they possibly put in a 2014 phone?
By flyingpants1 on 10/30/2013 8:23:30 PM , Rating: 2
What would make it worth it for you to upgrade in 2014? We already have every specification nearly maxed out, we are way past the point of diminishing returns.

If Samsung's Galaxy S4 with Snapdragon 600 can still feel laggy due to their awful Touchwiz skin, that should be a clue that specs don't matter, it's the user experience that matters. I have a 1GHz Windows 7 phone that doesn't feel laggy.

70mm wide (Nexus 4, 5, Galaxy S3, S4, LG G2) is really just about the limit for one-handed use.

Anything greater than 1080p and you are just increasing power draw and cost for no reason.

I never use LTE, I just don't need 40mbit on my phone with a 2GB data cap. No need whatsoever for LTE-Advanced.

Cameras are fine, most people cannot tell the difference from a picture taken with a 5MP iPhone 4 camera.

The only specs left to increase are battery size, speaker quality (HTC One's front speakers), and removable storage (microSD card).




By vision33r on 10/31/2013 5:51:18 PM , Rating: 3
Samsung needs to make a top quality small smartphone. I don't think you can go past 5.9" screen size and call it a phone. And for processors 4 is enough before you run into all sorts of problems with the cpu clusters and too much memory will eat power and heat.

Make things smaller again and work more efficiently.
















Copyright 2014 DailyTech LLC.