


New 28 nm chip is more than twice as powerful as anything on the market

Qualcomm, Inc. (QCOM) impressed at the 2012 Consumer Electronics Show in January.  At the show it talked about its transition down to the 28 nm node (via new processes from Taiwan Semiconductor Manufacturing Comp., Ltd. (TPE:2330)) and about pushing Snapdragon S4 system-on-a-chip designs onto the market.

Developers' first taste of S4 will be delivered via the MSM8960 Mobile Development Platform (MDP), which will be distributed at the 2012 Mobile World Congress, which starts Feb. 27 in Barcelona, Spain.  But AnandTech scored an early unit and has been busy benchmarking it.

How does the fourth generation Snapdragon fare? To quote AnandTech:

The [CPU] performance advantage... is insane.

The super-chip's performance comes, in part, from the new Krait core, which implements the ARMv7 instruction set licensed from ARM Holdings plc. (LON:ARM).  The new core is similar, in some ways, to ARM Holdings' own Cortex-A15 intellectual property core.

The configuration tested by AnandTech was the MSM8960 MDP, a dual-core 1.5 GHz design.  Quad-core and single-core variants will also be available.  The cores onboard pack twice the L2 cache (1 MB) of their predecessor, a deeper 11-stage pipeline, a 50 percent wider instruction decoder (3-wide), and twice the execution ports (7).  The cores support out-of-order execution and 128-bit NEON instructions.

In certain math-heavy (e.g. Linpack) or cache-heavy benchmarks, the S4 more than doubles the scores of its closest competitors: the Motorola Droid RAZR (1.2 GHz dual-core OMAP4430 from Texas Instruments, Inc. (TXN)), the Galaxy S II (also OMAP4430), and the Galaxy Nexus (1.2 GHz dual-core OMAP4460).  In browser benchmarks, the new S4 chip is 20-35 percent faster.

AnandTech summarizes:

These results as a whole simply quantify what we've felt during our use of the MSM8960 MDP: this is the absolute smoothest we've ever seen Ice Cream Sandwich run.

The on-die Adreno 225 GPU also receives a nice bump.  While it does not dominate like the CPU, the transition to 28 nm allowed Qualcomm to squeeze in Direct3D feature level 9_3 support and bump the clock speed to 400 MHz, up substantially from the 266 MHz Adreno 220.

The improvements allow the Adreno 225 to trade blows with the PowerVR SGX543MP2 (from Imagination Technologies Group plc. (LON:IMG)) in Apple, Inc.'s (AAPL) iPhone 4S and the ARM Holdings Mali-400 in the Galaxy S II.  In both comparisons the Adreno 225 proves its mettle, matching or beating the competition in some tests.

In its power benchmarks, AnandTech used the platform's built-in power-measuring tool, the Trepn Profiler software.  The results showed that in periods of modest activity (e.g. web browsing) a single Krait core was used, with the second core and GPU only occasionally becoming active.  Power consumption hovered around 400-750 milliwatts, with a peak of about 750 milliwatts per core.  Idle periods showed about 150 milliwatts.

Qualcomm
The Snapdragon S4 looks to be the most powerful smartphone CPU when it launches later this year in handsets. [Image Source: Jason Mick/DailyTech]

During demanding GPU activity like games, the platform could suck down 800-1200 milliwatts.

The average Android cell phone packs a battery of between 1,600 and 2,000 milliwatt-hours, so this indicates that the phone is good for about 3-4 hours of web browsing and 1.5-2 hours of gaming.  Part of this lower battery life is thanks to the development platform's larger 4-inch display, much as the iPhone's strong battery life is thanks to its smaller 3.5-inch display.
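The runtime figures above follow from simple division of battery capacity by average draw. A minimal sketch of that arithmetic, assuming the article's 1,600-2,000 mWh capacity figures and rough midpoints of the measured draws:

```python
def battery_life_hours(capacity_mwh, draw_mw):
    """Runtime estimate: battery capacity (mWh) divided by average draw (mW)."""
    return capacity_mwh / draw_mw

# Assumed 1,600-2,000 mWh batteries against midpoints of the measured draws:
# ~500 mW for web browsing, ~1,000 mW for gaming.
browsing = (battery_life_hours(1600, 500), battery_life_hours(2000, 500))   # ~3.2-4.0 h
gaming = (battery_life_hours(1600, 1000), battery_life_hours(2000, 1000))   # ~1.6-2.0 h
```

The 500 mW and 1,000 mW midpoints are assumptions drawn from the 400-750 mW and 800-1,200 mW ranges reported above; actual runtime depends on the display, radios, and workload mix as well.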

The development model was running Android 4.0.3.

Overall, the S4 is likely to see heavy pickup in Windows Phone models.  It could also win business from Samsung Electronics Comp., Ltd. (KS:005930), which experimented with the S3 in some of its Galaxy S II models.

Source: AnandTech






Useless unless...
By DEVGRU on 2/21/2012 4:22:40 PM , Rating: 1
Hey that's great. Sounds really cool.

Unfortunately, all that performance means nothing when the battery dies. If you have a relatively new smartphone, you know all the crap that's 'bundled' on the phone at the factory, and how fast all those worthless apps and features running in the background will have you wondering why you're not just carrying around a netbook with a headset for Skype.

Some unwanted stuff can be turned off, some can't. What's the use of an uber-fast CPU/GPU on a phone if the providers just keep filling the phones with battery-sucking adware?

Sure, we as power users can disable some stuff and tweak things a bit (thank God for the likes of Cyanogen), but what about Joe User, who just wants a cool phone that works when you need it to?




RE: Useless unless...
By drycrust3 on 2/21/2012 4:42:29 PM , Rating: 3
quote:
just keep filing the phones with battery-sucking adware?

Totally agree. The phone should be capable of lasting at least 12 hours between charges when it is purchased. But no, you have to go through the phone and weed out uninvited constant email inbox checks (which also wastes your data allowance) and lots of other nonsense before it will last 12 hours.


RE: Useless unless...
By ChronoReverse on 2/21/2012 5:14:12 PM , Rating: 2
Eh, I have Push email turned on and my phone easily lasts over 12 hours.


RE: Useless unless...
By kmmatney on 2/22/12, Rating: -1
RE: Useless unless...
By ChronoReverse on 2/22/2012 1:47:36 PM , Rating: 1
Well, I don't get 3 days but I do get 2 days under normal use with my Android phone.

Still, the point was that most smartphones usually get one day at least.


RE: Useless unless...
By theapparition on 2/22/2012 9:38:54 AM , Rating: 2
But despite these preliminary tests, that's one area where Qualcomm really concentrated their efforts and made vast improvements.

Currently, Cortex A8 and A15 designs have dual cores that are not independent. But Qualcomm's Krait architecture enables independently clocked cores. Which means that one or more cores can be completely shut off, saving lots of power. Also, individual cores can be ramped up when needed, so again saving power.

I don't know what experience everyone else has with phones, but Galaxy Nexus will last all of 24hrs with about 3hrs screen time.


RE: Useless unless...
By subflava on 2/21/12, Rating: -1
RE: Useless unless...
By tayb on 2/21/2012 5:07:47 PM , Rating: 3
I was under the impression that applications that are "running" in Android are only consuming trivial amounts of RAM and are not utilizing any CPU until commanded to open. This article,

http://geekfor.me/faq/you-shouldnt-be-using-a-task...

mentions that actively trying to kill those apps is actually wasting battery by making Android kill them and completely re-open them.

I agree with your main point though. A phone that gets 3-4 hours of battery life on web browsing is worthless to me.


RE: Useless unless...
By BioHazardous on 2/21/2012 5:11:00 PM , Rating: 2
All of this speculation is great.. Except the phone's battery life hasn't been tested and the analysis done here with:
quote:
The average Android cell phone has been a 1600 and 2000 milliwatt-hour battery,

is just wrong. Most smart phones of this nature have a 5+Whr battery. The iPhone has a 5.3Whr battery and the development platform for the Qualcomm S4 has a 5.6Whr. Feel free to redo the math with those numbers for battery size.


RE: Useless unless...
By wordsworm on 2/22/2012 12:58:51 AM , Rating: 2
Can you explain to me what your comment on the iPhone has to do with the average power ratings on batteries associated with Android devices?


RE: Useless unless...
By messele on 2/22/2012 1:36:07 AM , Rating: 1
As per usual Mick couldn't resist a feeble attempt at bashing the iPhone right there in the 'article' so it's already been brought into the discussion...


RE: Useless unless...
By sprockkets on 2/22/2012 5:45:25 PM , Rating: 2
Feeble attempt? He's just stating facts about how the battery life is good on it and comparing the GPU in it to the new Adreno 225. Kinda hard to not compare it since it is the top GPU right now in phones.

Only an Apple fanboi sees this as "bashing".


RE: Useless unless...
By BioHazardous on 2/22/2012 8:55:01 AM , Rating: 2
quote:
Can you explain to me what your comment on the iPhone has to do with the average power ratings on batteries associated with Android devices?

Sure, we're talking about battery life and the article mentions the iPhone and as a point of reference it seems necessary to know that the iPhone has a 5.3Whr battery and the development platform for the S4 has a 5.6Whr battery.

Now that I've explained myself for using the sacred word 'iPhone'.. The point was that the math for the estimates on battery life were horribly inaccurate to say at most the batteries that go into smart phones are 2Whr. I obviously proved that by pointing out that the iPhone has a 5.3Whr and the S4 dev platform has a 5.6Whr. When you do the math, that's between 11 and 14 hours of battery life browsing the web. Are we all satisfied now that it's not as god awful as everybody jumped to believe after the analysis done in the article? Ok thanks, and I'm terribly sorry for mentioning the iPhone as a point of reference.


RE: Useless unless...
By BillyBatson on 3/13/2012 7:10:20 PM , Rating: 1
lol bio definitely +1 hilarious
And I'm anti-Apple, but pro-iPhone, and window8 mobile hopeful


RE: Useless unless...
By naiche on 2/22/2012 2:37:58 PM , Rating: 2
Yes, the article is seriously wrong. Just to give an example, the Nexus One, which is old and doesn't have a very large battery, uses a 1400 mAh, 3.7 V pack. Since energy = capacity × voltage, we have:

E = 1.4 Ah * 3.7 V = 5.18 Wh


RE: Useless unless...
By bug77 on 2/21/2012 5:51:57 PM , Rating: 3
quote:
If anyone has a relatively new smartphone they know all the crap thats 'bundled' on the phone at the factory, you know how fast all those worthless apps and features running in the background will have you wondering why your just not carrying around a netbook with a headset for Skype.

Some unwanted stuff can be turned off, some can't.


Umm, you do know that ICS is supposed to let you turn off anything you don't use, don't you? Of course, not many users have ICS right now, but that will change in a few months.


RE: Useless unless...
By Alexvrb on 2/21/2012 7:34:16 PM , Rating: 2
Even on a non-vanilla build? Will you be able to remove all the OEMware and Carrierware? Or just for "pure Google" devices?


RE: Useless unless...
By ChronoReverse on 2/21/2012 7:51:52 PM , Rating: 2
You can disable anything. It was made a big deal of in every single ICS review I've seen on major sites.


RE: Useless unless...
By tayb on 2/21/2012 8:32:54 PM , Rating: 2
I'm not sure how that would be possible unless Google isn't allowing software customization anymore. On "Google Experience" devices you get what Android is supposed to be but with everything else you get crapware that can't be removed.

My Droid X for example has a handful of apps that cannot be removed easily. And then if I were to remove them (which I unfortunately did once) you are no longer able to receive software updates as the update checks your phone for those deleted apps. Getting those apps back was a joy. That's all thanks to the carrier, not Google. I have a hard time believing Google has fixed that nonsense unless they've made Android closed-source software.


RE: Useless unless...
By sprockkets on 2/21/2012 9:16:32 PM , Rating: 2
They allow software customization, but you can disable the programs, since you can't really take them off the ROM.

Disabling, say, a skin like Sense is a different story. At this point, however, Sense on ICS on my HTC Sensation is in better working order.


OMAP?
By protosv on 2/22/2012 9:05:27 AM , Rating: 2
Perhaps I'm a bit confused, but I thought the OMAP series was Texas Instruments' implementation of the Cortex A9. If so, how is an OMAP running in an SGS II? I thought they use an Exynos?




RE: OMAP?
By theapparition on 2/22/2012 9:44:13 AM , Rating: 2
You're not confused. The SGSII primarily uses the Exynos 4410. The only exception is the TMobile variant uses a Snapdragon. But no SGS or SGSII uses an OMAP.

Only the Galaxy Nexus uses an OMAP.


RE: OMAP?
By LordSojar on 2/22/2012 10:00:29 AM , Rating: 2
quote:
But no SGS or SGSII uses an OMAP.


My Galaxy SII LTE uses an OMAP Snapdragon S3 @ 1.5GHz... just sayin. This is on AT&T btw... and it's an absolutely amazing phone.


RE: OMAP?
By theapparition on 2/22/2012 10:11:57 AM , Rating: 2
There's no such thing as an OMAP Snapdragon S3.

OMAP = Texas Instruments
Snapdragon = Qualcomm

The AT&T variant you have uses the S3 chip from Qualcomm, not the TI OMAP.


impressive
By Soulkeeper on 2/22/12, Rating: 0
RE: impressive
By B3an on 2/22/2012 12:02:04 PM , Rating: 2
It won't compare to Sandy Bridge clock for clock, that's for sure. But it will easily compare to Atom and similar chips from AMD.


RE: impressive
By Soulkeeper on 2/23/2012 12:43:23 AM , Rating: 2
well, yeah ...


RE: impressive
By Soulkeeper on 2/23/2012 12:45:48 AM , Rating: 2
I wasn't wrong in any way.
Funny how the mindless drones rated my post down.


Area
By omnicronx on 2/22/2012 2:32:53 PM , Rating: 2
quote:
Again, this lower battery life is thanks to the development platform's larger 4-inch display, much as the iPhone's strong battery life is thanks to its small 3.5-inch display.
Just wondering how you came to this conclusion?

We are talking about 12.5% difference in surface area between a 5:4 3.5" iPhone, and a 16:9 4" Android device.

Would the device's resolution also play a big part in battery consumption? With a 4" device I am also going to assume it's still an 854*480 display, so would an Android device of this size not also have fewer pixels to push than the 960*640 iPhone display?

I just really have my doubts that Apple's strong battery life is due to the smaller screen. It's quite apparent in Android 4 that Google has done a lot of work to optimize certain hardware for the OS to help improve battery life, which is something Apple has long excelled at (and which is much easier when you are vertically integrated and control most of the hardware being used in your devices).




RE: Area
By omnicronx on 2/22/2012 2:39:01 PM , Rating: 2
woops.. I keep forgetting the iPhone is 2:3 ish.. So closer to 17%..


Impressive
By TakinYourPoints on 2/21/2012 5:34:13 PM , Rating: 2
Most impressive, really awesome benchmarks there




great!!
By cokbun on 2/21/2012 9:24:42 PM , Rating: 2
now i can browse 30% faster !!!!




GPU playing catchup
By augiem on 2/22/2012 2:54:08 AM , Rating: 2
Nice to see the Adreno finally can match the SGX543MP, but sadly that's almost a year old already. It's a little irritating that Apple seems to be the only one out there who pairs a top notch CPU with a top notch GPU. The PS Vita is an exception, but it's not a phone/tablet.




Battery life is king
By samfms on 2/22/2012 10:55:19 AM , Rating: 2
Great chip....now under-clock the chip to make it consume less power and give me a phone that can last at least two days under average use....otherwise I'm not interested!

who is with me?




By fteoath64 on 2/23/2012 7:29:00 AM , Rating: 2
Great that it turned out as expected for Qualcomm. This chip will be a new standard for phones and tablets this year, and I hope they turn out tens of millions of units. This is what ICS needed, and the industry needs a new boost after the success of Tegra 2 last year. Great to see the competition with the A15; the GPU of choice here is also very respectable, so it is a great balance.

As to battery life, it will have to be a design factor and a software factor to optimise for it. Qualcomm is certainly hoping to top their 1Ghz SnapDragon chip which did well. I think this chip will double its volume very easily since it is out-of-the-gate this quick.
Well done, Qualcomm!




Always the idiot comment
By BSMonitor on 2/21/12, Rating: -1
RE: Always the idiot comment
By messele on 2/21/12, Rating: -1
RE: Always the idiot comment
By Kwiker on 2/21/2012 5:08:09 PM , Rating: 5
3.5" display = 1.94"x2.92" @1:1.5 = 5.66"sq
4" display = 2.22"x3.33" @1:1.5 = 7.39"sq

that .5" accounts for a 23.4% increase in the amount of surface area that needs to be illuminated.

I don't use either platform, I just like correct math :)


RE: Always the idiot comment
By messele on 2/21/12, Rating: -1
RE: Always the idiot comment
By BSMonitor on 2/23/2012 2:28:24 PM , Rating: 1
quote:
BSMonitor sucks at mathematics but somehow managed to actually gain 2 points in his rating in the time that Kwiker (top of the class) pointed that out. Mick will be happy that new improved 'droidagra makes you 23.4% less flaccid.


7/6 is ~16%. The math was correct. Donkey.

It's called an estimate of the dimensions. 16%, 25%, 35%.. None of them is close to saying Large vs. Small.

You my friend are a moron.


RE: Always the idiot comment
By Jeremy87 on 2/21/2012 6:47:53 PM , Rating: 3
7.39/5.66 = 30.6% increase, not 23.4.

I just like correcting corrected math.


RE: Always the idiot comment
By Solice55 on 2/22/2012 10:19:41 AM , Rating: 2
5.66/7.39 = 0.7659, or 76.6%

If 7.39 is 100%, then this would be a change of 23.4% (100 - 76.6).


RE: Always the idiot comment
By GmTrix on 2/22/2012 11:39:15 AM , Rating: 2
7.39/5.66 = 1.305 or (an increase of 30.5%)

5.66/7.39 = 0.765 or (a decrease of 76.5%)

It's not accurate to say that (100 - 76.5)% is the percent increase because that 76.5% is a percent of 7.39 not 5.66.

To make it clear:

"What percentage do you times by 5.66 to get 7.39?", i.e.
5.66*increase% = 7.39 ==> 1.305 ==> 30.5% increase

"What percentage do you times by 7.39 to get 5.66?", i.e.
7.39*decrease% = 5.66 ==> 0.765 ==> 76.5% decrease


RE: Always the idiot comment
By theapparition on 2/22/2012 12:02:22 PM , Rating: 2
Not to nitpick, as I agree with most everything you have listed, but when talking about decrease, it's 1-the percentage.

So it would be more correct to say it's 76.5% of the brightness or a 23.5% decrease.

76.5% decrease implies that something is only 23.5% as much.


RE: Always the idiot comment
By GmTrix on 2/22/2012 12:53:47 PM , Rating: 2
Yep you're right. My bad.


RE: Always the idiot comment
By theapparition on 2/22/2012 10:08:08 AM , Rating: 3
Need to correct some things a bit:
You correctly listed Apple screen as a 1:1.5 ratio.

3.5" display @ 1:1.5 = 1.94"x2.92" = 5.66"sq

But not all 4" displays are 1:1.5; since they are Android devices, the aspect ratio is closer to 1.6667 (800x480), or 1.7778 for qHD and HD (960x540 or 1280x720).

So replacing some numbers with real life android equivalents on some models.
4.0" display @ 1:1.6667 = 2.06"x3.43" = 7.07"sq
4.3" display @ 1:1.7778 = 2.11"x3.75" = 7.91"sq
4.65"display @ 1:1.7778 = 2.28"x4.05" = 9.24"sq

So that would be 24.9%, 39.8% or a whopping 63.3% increase of lighted area.

Considering that a full 720 HD screen is also pushing 50% more pixels than the iPhone 4/4S, then you can see where some extra power is going.
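The screen geometry worked out in this subthread can be checked with a short script; a sketch assuming the diagonal sizes and aspect ratios quoted above:

```python
import math

def panel_area(diagonal_in, aspect):
    """Area (sq in) of a screen given its diagonal and width:height aspect ratio."""
    height = diagonal_in / math.sqrt(1 + aspect ** 2)
    width = aspect * height
    return width * height

iphone = panel_area(3.5, 1.5)              # ~5.66 sq in (3:2 panel)
android_wvga = panel_area(4.0, 800 / 480)  # ~7.07 sq in
android_hd = panel_area(4.65, 16 / 9)      # ~9.24 sq in

# Percent increase is measured against the smaller screen
# (dividing the other way gives the percent *decrease*, the source of
# the 23.4% vs 30.5% confusion earlier in the thread).
increase = (android_wvga / iphone - 1) * 100   # ~24.9%
```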


RE: Always the idiot comment
By Kwiker on 2/22/2012 10:45:13 AM , Rating: 3
very nice :) sorry I was too lazy to dig up exact ratios. I was mostly just trying to point out that the .5" was far less of a "trivial" increase as was being claimed.


RE: Always the idiot comment
By BSMonitor on 2/23/2012 2:25:09 PM , Rating: 1
Donkey. Missed point entirely.

Large compared to small is not 25%. Grow up


RE: Always the idiot comment
By BSMonitor on 2/23/2012 2:22:55 PM , Rating: 1
Well, it's called quick estimating. I was off 7%. I was not lame enough to look up the exact dimensions down to the 100ths of an inch.

7/6 = 16% more. Grow up donkey. Math was correct.

23% or 16% my point is the same.

If Large is 1.25 and Small is 1.00. Large is then defined as pathetic. As the difference doesn't even make up a quarter of small.

Or, the terms do not apply correctly. Hence JMicks BS use of the English language.


Copyright 2014 DailyTech LLC.