
K1 is impressive in AnTuTu benchmark

During CES 2014, NVIDIA showed off a new processor in the Tegra line: the Tegra K1. This week, benchmark results claimed to be from that chip hit the web.
 
We already know most of the K1's hardware specs, including the fact that the chip packs 192 CUDA cores. The chip is a 32-bit quad-core unit, and a second version is tipped to carry dual 64-bit Denver cores running at up to 2.5GHz.
 
The K1 has a maximum clock speed of 2.3GHz (in the quad-core configuration) and supports DDR3L and LPDDR3 memory up to 8GB. The chip will also support displays with resolutions up to 3840 x 2160.
 
The benchmarks show what the chip scored in AnTuTu running Android 4.4.2. The unidentified device used for the run has a 1920x1080 screen and 2GB of RAM.
 
You can see the full results of the benchmarks in the images below:

[Images courtesy MyDrivers]
 
The Tegra K1 should make for some impressive smartphones and tablets when they start coming to market, but NVIDIA first has to get on the ball with design wins.

Sources: MyDrivers, Neowin






Blah. Nvidia pre-release benchmarks mean zero
By retrospooty on 3/6/2014 9:33:59 AM , Rating: 5
When it's released in a real product without heat or battery issues, and it's independently tested by trusted review sites, you can believe it, and not a second before.




RE: Blah. Nvidia pre-release benchmarks mean zero
By TSS on 3/6/2014 10:30:50 AM , Rating: 2
Not entirely. I'll believe the performance to be in the area of where they claim it'll be. Both the Tegra 3 and 4 had good performance for their generations; I don't see why this time should be different.

They also use a lot of power and generate a lot of heat. Considering those benchmarks are absent, as you noted, it's not a stretch to think they're absent because this chip also has a ludicrous power draw. If it were different, Nvidia would be the first to point that out.

All in all, just the next version of tegra. Nothing too exciting.


By retrospooty on 3/6/2014 1:55:37 PM , Rating: 2
I know... That is why I am saying wait until it's in a real released product, independently reviewed, and shown by trusted third-party review sites not to have heat or battery drain issues. In short, Nvidia is full of crap.


By CaedenV on 3/6/2014 1:58:54 PM , Rating: 2
Even if K1 fixes the power and heat issues (which is supposed to be a major focus), the fact is that Nvidia always asks too much for its mobile chips, so very few devices will implement them.


By ritualm on 3/6/2014 2:41:43 PM , Rating: 2
quote:
I'll believe...

Show me actual product or go home. Benchmarks are nothing when the hardware itself sits next to a bin called vaporware.


By Da W on 3/6/2014 2:56:43 PM , Rating: 2
Excited if they do a Shield 2 with it. Otherwise I don't care.


RE: Blah. Nvidia pre-release benchmarks mean zero
By name99 on 3/6/2014 4:13:07 PM , Rating: 4
No-one gives a fsck about absolute performance. You want AWESOME graphics performance --- buy an AMD FirePro and feed it 500W.

The issue is ALWAYS whether the performance comes at an acceptable power usage or not. The rumors are that this thing uses power in the tens of watts range to get these sorts of numbers. That sort of power usage makes it useless for phones, useless for tablets, useless for many (not all) laptops.

If the target markets are now (bulky) laptops and desktops, let's see it compared against chips that compete in that market. Seeing it compared against phone GPUs is as useless as telling me that a 4GHz i7 runs a lot faster than an A57. True, but irrelevant to anything.


By Morawka on 3/8/2014 4:22:49 AM , Rating: 2
They've already shown the K1 running at less than 1W (measured in real time with a voltmeter) on stage while running the Unreal Engine 4 tech demo. I dunno what else you want.


Excellent
By bug77 on 3/6/2014 11:59:47 AM , Rating: 2
This shows just how wasteful quad-core+ chips are today for mobile devices.
Higher-clocked dual-core chips seem to have no issue keeping pace. Couple that with the fact that not even demanding mobile games use more than two cores (Ars Technica did an article on this), and a dual-core K1 starts looking like a real smart choice.

*The above assumes the dual-core version will not use more power than the quad-core one. This is usually the case, but until we see actual results we don't know for sure.




RE: Excellent
By Argon18 on 3/6/2014 12:43:40 PM , Rating: 2
That's complete nonsense. Phones need one full core for the real-time tasks of monitoring the cellular network. It's why you see a substantial game performance boost going from single-core to dual-core on a phone, even for games that are single-threaded.

Not to mention that you've likely got other apps running in the background checking your email, getting Facebook updates, etc.

Just because a *game* doesn't utilize four cores, doesn't mean the *phone* can't fully utilize them. (Remember that for mobile phone game benchmarks, they turn the radio off and they halt all other background apps).


RE: Excellent
By bug77 on 3/6/2014 3:12:11 PM , Rating: 3
quote:
Phones need one full core for the real-time tasks of monitoring the cellular network.


If that were true, I wouldn't be able to do anything but make basic phone calls on my single-core phone. Strangely enough, I can do a hell of a lot more than that.

I hope you do know a core can run any number of threads without a hitch, as long as the threads combined don't need more than 100% of the core's capacity. If you have two threads using 10% of the computing power each, they will happily run on a single core.
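
To put a number on that, here's a minimal sketch (Python, Linux-only; the 10% duty cycle and timings are made-up figures, not measurements of any real chip) that pins the whole process to one core and runs two light threads on it:

    import os
    import threading
    import time

    os.sched_setaffinity(0, {0})  # pin this whole process to core 0

    def light_worker(duty=0.10, period=0.1, duration=3.0):
        # Burn roughly `duty` of the core: spin briefly, then sleep.
        end = time.monotonic() + duration
        while time.monotonic() < end:
            spin_until = time.monotonic() + period * duty
            while time.monotonic() < spin_until:
                pass  # simulated work
            time.sleep(period * (1 - duty))

    threads = [threading.Thread(target=light_worker) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

Watching top during the run shows core 0 sitting at roughly 20% load; neither thread is starved.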


RE: Excellent
By name99 on 3/6/14, Rating: 0
RE: Excellent
By retrospooty on 3/6/2014 12:50:24 PM , Rating: 2
But it's totally free. When not in use, cores power off entirely. When needed, the power is there.
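
You can actually watch this on any Linux box or rooted Android device: kernels of this era hot-plug idle cores (deeper idle states power-gate them as well), and sysfs reports which cores are currently off. A minimal sketch, assuming the standard sysfs layout:

    from pathlib import Path

    # cpu0 usually has no "online" file because it cannot be unplugged.
    cpus = sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*"),
                  key=lambda p: int(p.name[3:]))
    for cpu in cpus:
        online = cpu / "online"
        if not online.exists():
            print(f"{cpu.name}: always on")
        elif online.read_text().strip() == "1":
            print(f"{cpu.name}: online")
        else:
            print(f"{cpu.name}: offline (powered down)")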


RE: Excellent
By bug77 on 3/6/2014 3:15:09 PM , Rating: 2
Hopefully, if the software works flawlessly.
However, the trade-off is that a dual-core can be clocked higher, so when your CPU-intensive stuff only uses a core or two, a lower-clocked quad can still hold it back. Mail, FB, or taking photos will not do that, so this is mostly about games, I think.


RE: Excellent
By CaedenV on 3/6/2014 2:07:43 PM , Rating: 2
While I generally agree with you, there are some odd exceptions when it comes to mobile devices. On a PC almost every individual part has its own processor, while on a phone a lot of peripheral processing is done on the CPU. Phones also tend to be doing more and more video and photo work, which is much easier (and more efficient on the battery) to run multi-core than games and other software. Also, background programs and processes can take quite a few CPU cycles, and a dual-core game will run much faster if it can offload those processes to another core.

So while a quad core may well be overkill, there are plenty of reasons to have more than two cores. It would be really interesting to see a design with one small core to manage the phone's peripherals and background processes, and then two normal-sized cores to take care of active tasks... but I don't think that will be happening any time soon.

Between K1, 805, and the new Atom processors coming down the pipe, it looks like we will see some nice improvements in next gen phones just in time for my contract to expire! Things are lining up beautifully!


RE: Excellent
By bug77 on 3/6/2014 3:21:08 PM , Rating: 2
quote:
Between K1, 805, and the new Atom processors coming down the pipe, it looks like we will see some nice improvements in next gen phones just in time for my contract to expire! Things are lining up beautifully!


The only improvement I'm looking forward to is battery life (hence my bias towards dual-core solutions). Unfortunately, with the advent of the smartphone, it took a plunge to about one day and has remained around that mark ever since. I know processing power has improved dramatically, but I'd like to see a decent phone that will last me a week or so. Not everybody needs the absolute fastest, just like not everybody buys exclusively Intel's Extreme Edition CPUs.


RE: Excellent
By purerice on 3/6/2014 4:13:32 PM , Rating: 2
If I may also generally agree with you, if only to add a caveat to your "overkill" statement: each core can, in theory, run at an independent speed.

Also, power requirements scale faster than clock speed does.

So a single-core CPU running one major task and several smaller tasks at 2.2GHz will use significantly more power than a quad-core CPU with one core at 1GHz and three cores each at 400MHz.

When it comes to battery life, spreading as many tasks as you can across many low-speed "processors" is the way to go.

Eight or 16 simple cores would be even better for 90% of typical usage, as far as battery life is concerned.
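
For rough numbers, take the textbook dynamic-power model P ~ C * V^2 * f and assume voltage scales roughly linearly with frequency, so power grows roughly with f^3. That's an approximation, not measured silicon, but it shows why the comparison above comes out the way it does:

    def relative_power(freq_ghz):
        # Dynamic power relative to a 1GHz core, under the rough P ~ f^3 model.
        return freq_ghz ** 3

    single = relative_power(2.2)                          # one core at 2.2GHz
    quad = relative_power(1.0) + 3 * relative_power(0.4)  # 1GHz + 3 x 400MHz

    print(f"single core @ 2.2GHz: {single:.2f}x")  # ~10.65x
    print(f"quad 1GHz + 3x400MHz: {quad:.2f}x")    # ~1.19x

Even if the real curve is gentler than cubic, the gap is big enough to make the point.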


RE: Excellent
By bug77 on 3/7/2014 9:07:27 AM , Rating: 2
Frequency vs. power is much, much more complicated. I'll try to explain it briefly when I get some time on my hands.


By TheJian on 3/6/2014 2:48:47 PM , Rating: 2
Not quite sure why it's showing 3GHz when the dual-core Denver version is supposed to be 2.5GHz. Considering the die shots had the two Denver cores at about 2x the size of the quad's cores, I expect them to run at roughly the same speeds (2.5GHz vs. the A15 quad at 2.3GHz). I wasn't expecting 3GHz, so I'm confused.

Anyone have any data on this?




Flip Phone Battery Life?
By jmunjr on 3/7/2014 3:15:59 AM , Rating: 2
I'll be impressed when any phone can match the battery life of my old flip phone, which could easily sit on standby for 4+ days with no calls and usually lasted over two days with mild usage.

I wish there were a smartphone that could mimic that performance, even if it means turning everything off except the phone and text. Sadly, even with extended batteries, few Android phones can last more than 36 hours with mild use (much less with regular batteries).

I would buy a phone with fewer features and less performance that had killer battery life...




Nvidia K1
By vision33r on 3/9/2014 4:34:20 PM , Rating: 2
This benchmark is silly. All the other GPU makers are using silicon already tested in a phone that won't melt, compared with the test Nvidia leaked out, which was probably done on a PCB strapped with a heatsink and fan.




"The whole principle [of censorship] is wrong. It's like demanding that grown men live on skim milk because the baby can't have steak." -- Robert Heinlein














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki