
NVIDIA unleashes its metaphorical can of "whoop ass"

There was a time -- back in 2010 -- when NVIDIA Corp. (NVDA) looked a bit lost.  A resurgent Radeon brand, owned by Advanced Micro Devices, Inc. (AMD), was punishing it in the discrete graphics market with the Radeon HD 5000 series devices, and analysts were scratching their heads at NVIDIA's focus on GPU computing.  

And NVIDIA's Tegra system-on-a-chip effort was largely written off, as NVIDIA couldn't seem to figure out what it wanted to do with it -- netbooks? Mobile devices?  No one could quite tell.

I. A Young Power in the Mobile Market

Fast-forward three years and NVIDIA is in a far different -- and far better -- position.  GPU computing is an exploding field and NVIDIA has large purchase orders from the hottest new deployments.  It's back to scoring wins in the gaming graphics market.

And most importantly Tegra has exploded, seizing a commanding stake in the mobile device system-on-a-chip (SoC) market.

In calendar/fiscal Q3 2012, NVIDIA posted large earnings and revenue surprises, delivered thanks to its SoC and GPU market gains.  Revenue came in at $1.204B USD (GAAP), pleasantly above the consensus of $1.192B USD estimated by 33 analysts surveyed by The Financial Times.

NVIDIA Tegra chip
Tegra 3 drove NVIDIA to a big profit surprise.

But much like system-on-a-chip archrival Qualcomm, Inc. (QCOM), the biggest surprise lay not on the revenue, but on the net income (profit) front.  NVIDIA pocketed a whopping $209M USD ($0.33 USD/share), above even the most optimistic analyst estimate of $203M USD ($0.32 USD/share), and well above the average estimate of $187M USD.
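
As a rough sanity check on those reported figures, simple division recovers the implied share count and the size of the beat (illustrative arithmetic only; actual GAAP diluted share counts differ slightly):

```python
# Rough sanity check on the reported figures. Simple division only;
# actual GAAP diluted share counts will differ slightly.
net_income = 209e6    # reported net income, USD
eps = 0.33            # reported earnings per share, USD

implied_shares = net_income / eps
print(f"Implied share count: ~{implied_shares / 1e6:.0f}M")   # ~633M

# Size of the beat over the average analyst estimate of $187M
avg_estimate = 187e6
beat_pct = (net_income - avg_estimate) / avg_estimate * 100
print(f"Beat vs. consensus: {beat_pct:.1f}%")                 # 11.8%
```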

NVIDIA's at times colorful and divisive chief executive officer and president, Jen-Hsun Huang, crowed, "Investments in our new growth strategies paid off this quarter in record revenues and margins.  Kepler GPUs are winning across the special-purpose PC markets we serve, from gaming to design to supercomputing. And Tegra is powering some of the most innovative tablets, phones and cars in the market."

The chipmaker also decided to share the wealth with its shareholders, offering up a quarterly dividend of 7.5 cents per share.

Capital expenditures for NVIDIA have grown as the company sharpens its focus on bleeding-edge system-on-a-chip research.  NVIDIA estimates that it will spend $50M to $60M USD next quarter on R&D and other capital expenditures.  

II. Gloom for Q4, But It Could be Worse

Looking ahead, while NVIDIA's Q3 results mirror Qualcomm's, its Q4 guidance is gloomier than its rival's.  NVIDIA estimates that revenue will dip to between $1.025B and $1.175B USD on a slowing global economy, bucking the traditional holiday-season bump.

One possible reason why NVIDIA is more worried than Qualcomm is that much of its earnings are still driven by sales of high-end (Kepler) hardware (GPUs) for traditional consumer and enterprise systems.  When the economy slumps, these sales tend to suffer the most, as users consolidate their buying power towards cheaper mobile devices.  In that regard, a mixed mobile/traditional chipmaker like NVIDIA will likely be hurt more by a downturn than a solely mobile-centric chipmaker like Qualcomm.
GTX 580 3/4 view
A slowing economy is expected to dent NVIDIA's Q4 earnings.

However, NVIDIA's better-than-expected earnings do represent good news in a couple of ways.  First, NVIDIA and Qualcomm represent a reasonably good barometer by which to gauge the health of the mobile market.  And by the looks of it, mobile is flourishing at a time when other less fortunate markets find themselves facing tough financial questions.

Second, NVIDIA enjoys a key psychological advantage over Qualcomm in that it's pumping out quad-core units of its latest and greatest Tegra 3 processor, while Qualcomm's current Snapdragon S4 offerings are mostly constrained to dual-core CPUs.  That's a big reason why NVIDIA is thus far outpacing Qualcomm in design wins in the emerging Windows RT ARM-based laptop/hybrid device market.  

Traditional PCs demand more power, and NVIDIA has been the most aggressive about pushing higher core counts in its mobile chips.  That decision will likely pay off for the company, and help it ride out the storm ahead in the discrete graphics market.

Sources: NVIDIA, FT [analyst estimates]


cup half-empty vs cup half-full
By Silver2k7 on 11/9/2012 10:48:49 AM , Rating: 2
"In that regard, a mix mobile/traditional chipmaker like NVIDIA will likely be hurt more by a downturn than a solely mobile-centric chipmaker like Qualcomm."

*In that regard it should do much better now that it's added a mix of mobile things to its revenue stream*

fixed it for you ;P

RE: half-empty vs cup half-full
By othercents on 11/9/2012 11:04:25 AM , Rating: 2
Yeah, that absolutely didn't make sense. Typically a more diverse company like NVIDIA does better than one with only one market. However, there is a question of whether GPUs are the correct market for NVIDIA, especially since they have fallen behind AMD on price/performance. You still do have the NVIDIA fan base.

I really think NVIDIA would be a good purchase for a hardware vendor like ASUS. They'd just need to choose properly so as not to close the door on other vendors using the Tegra chips they produce.


RE: half-empty vs cup half-full
By Pirks on 11/9/12, Rating: 0
RE: half-empty vs cup half-full
By heffeque on 11/10/2012 9:46:01 AM , Rating: 2
Or AMD GPUs. Their latest ones kick ass at GPGPU.

RE: half-empty vs cup half-full
By TheJian on 11/12/2012 9:05:20 AM , Rating: 2
Only at home, not in the pro market. Nvidia RIGHTLY aimed their home cards at GAMES. Because guess what? We play games more than we fold at home...LOL. We play games more than we try to get bitcoins...LOL. I don't pay $300 for a card to do anything other than play games. The easy-bitcoin party is over, and who the heck runs their GPUs at 100% in today's economy to solve cancer for someone else? :) Run up my electric bill so you can make a billion on your next drug you try to sell me? SCREW YOU, company X. :)

Having said that, I'm seriously happy Nvidia decided Diablo 3 should run 116% faster than Radeons instead of being great at solving cancer on my home PC :) Well DUH.

For real work - K5000+Tesla until a Quadro110 comes out. There's a reason the combo will set you back $5K. :)

These are not Radeons ;) When you're running $1500-20K software on your system in the hopes of making millions, I don't expect it to run on my puny $300 home card as if it were a $3300 K20 coming next month. They're not designed for the same things.

But if it makes your fanboy side feel better, you just keep saying Radeon is faster at Folding@home...LOL. I've never even installed it, and most gamers don't even know what it is. Ask 10 people on the street what a bitcoin is...ROFL. Heck, when Fox News asked a bunch of people on the street if Obama won the debate (BEFORE it actually happened), most said "Obama won..."...ROFL. Followed by: why did you think he won last night? "Umm, he just had the better ideas, and I like what he had to say"...ROFL...Really? Then they asked, so did you watch the whole thing? "Oh yeah, Obama did great"...ROFL. I wish they would have told them it wasn't going to happen until the NEXT DAY to see the look on their faces. People don't even know what's going on today. A week later they asked something like "what do you think of Obama picking Romney as his running mate"..."It was a great idea, it really shows he's bipartisan". Just watch some Watters' World for some great laughs. People didn't even know who was in the race.

I'd guess if Watters went out on the street and asked people what they thought of the new "GPGPU technology, and do you think it will really help us go green and get better gas mileage without OIL?", they'd most likely say "oh yeah, those new GPGPUs are great green tech, and will really help us get off oil"...ROFL. 95% of America has no idea what GPGPU even is, let alone that AMD has a good home card for it ;) Their $157M loss and market-share loss to Nvidia say "it's the games, stupid!". Just like the economy :) Xmas won't be good to AMD either, but winning 13/16 games like NV did in the mobile grudge match gets into everyone's memory, article after article. That sells HOME CARDS. People think if you have the crown, all your cards must win. It's pretty clear NV has the crown. The 65% market share vs. AMD's share says it all. The $225M AMD owes GF on Dec 31st won't help the quarter for AMD at all either. I expect a $300-400 million loss for AMD for the xmas Q, and this is supposed to be the BEST quarter for all gizmo/gadget makers. On the other hand, I expect NV to have a $100-200M profit. That's a HUGE difference there, guy.

AMD's cup is 7/8ths empty, while NV's is 3/4 full :)

AMD: 64% chance of bankruptcy in the next 2 yrs. It was 51% before the Q loss last month. Note NV is at a 1% chance (and only 1% because they assign NOTHING 0%). The numbers don't lie. You need to start reading balance sheets along with your benchmarks ;) Just a thought.

I've predicted it on here/elsewhere many times in the last year: AMD bankrupt by xmas 2014, or bought. If they wait too long it will only be patents being bought rather than their pipeline (stop cutting prices, AMD! You might actually make money then). Even Intel went from a 1% chance a few months ago to 5% now that everyone is entering the server/desktop market (server 1st, though; NV has Denver for desktop probably earlier than most, since it sampled in May on 20nm at Samsung's Austin, TX plant). NV goes to a world-class fab at 20nm, while the rest fight over crap-class TSMC/GF. I can't wait to see what a 20nm Kepler does to noise and heat :) Watch for that next xmas, right after Denver hits. Denver is the testing ground for Nvidia's next step, which is Kepler (or some variant) on 20nm at Samsung. Samsung will use the much larger Kepler to take up the slack from Apple's millions of chips leaving their fabs, no matter what Tegra does. All the Kepler variants (home/pro) will take up a lot of wafers. At the same time Samsung will get a leg up on Apple's A7 as they try to get TSMC to actually get a chip right for once. Good luck Apple/AMD/Qualcomm ;) TSMC sucks.

RE: half-empty vs cup half-full
By NellyFromMA on 11/9/2012 12:27:35 PM , Rating: 3
There are only two true performance competitors in the GPU space and nVidia is number 2. Considering a decent portion of their profits derives from this market, and their absence would basically divert all performance GPU sales to their competitor-to-be in the mobile space, it's hard to imagine why they would get out of their primary market.

Only one company can be number 1, but saying they should get out of the market because they are number 2 just seems... idk, not sensible. Esp. if it's responsible for generating substantial profits...

RE: half-empty vs cup half-full
By TheJian on 11/12/2012 9:17:56 AM , Rating: 2
Not quite sure who you think is #1 in graphics cards if not Nvidia. AMD keeps posting losses ($157M this quarter), while NV owns 65% market share in this market, made $203M this quarter, and beat street estimates even on the high side.

Nvidia is #1 by market share and profits in GPUs, if we're talking cards here.

And there are already 5+ competitors in the mobile space, and AMD's 3 yrs too late to it. No ARM-based product before 2014 at the earliest (meaning mid-year most likely), so I'm confused by your statements.

Who's #1, and who's their competitor-to-be? Everyone that matters is already in the mobile space, and it's much easier for the SoC makers to add a 2nd SoC and still be equal die size to Intel's chips at less power. Intel has to shrink Haswell (which isn't out until next year anyway) to 14nm to really be in this race. Atom wouldn't be bad at 22nm, but that's not currently in the cards as Ivy Bridge (Haswell soon) etc. takes all that production. Again, I'm not getting what you're saying :)

By NellyFromMA on 11/12/2012 2:59:24 PM , Rating: 2
I'm posting based off the context of both prior comments and the story. I did not go research who is at the top this year. It fluctuates pretty often between generations and has not deviated from that regular sway for over a decade of my paying attention, so I think my point is it doesn't really matter who is #1 or not; ceding the position is dumb.

RE: half-empty vs cup half-full
By bug77 on 11/9/2012 12:38:21 PM , Rating: 2
However there is a question if GPUs is the correct market for NVIDIA especially since they have fallen behind AMD on price/performance.

Lol. Don't look here then: (in this instance, it's 660Ti vs 7950, just a random pick).

RE: half-empty vs cup half-full
By JonnyDough on 11/10/2012 2:54:46 PM , Rating: 2
That's from mid August, which means the data is likely from late September/early August. At least use relevant information if you are going to attempt to be a good fanboy.

RE: half-empty vs cup half-full
By spaced_ on 11/12/2012 2:43:35 AM , Rating: 2
I suppose in August/September everyone was buying AMD GPUs based on today's relevant data.

Perhaps current performance indicators indicate it's better to be an AMD fanboy. Perhaps tomorrow's will indicate it's better to be an Nvidia one.

Perhaps business and market trends work a little differently to the latest desktop GPU to GPU comparison on the latest review site.

Perhaps I'm making a statement, or perhaps I'm questions, or perhaps I'm doing neither.

RE: half-empty vs cup half-full
By spaced_ on 11/12/2012 2:44:46 AM , Rating: 2
*posing questions

RE: half-empty vs cup half-full
By TheJian on 11/12/2012 10:03:40 AM , Rating: 2
By all counts AMD is getting their clock cleaned.

Data from mid-August can't actually come from Sept, can it?...LOL. Sorry, couldn't resist. You're making this too easy.

Assuming I get your point and take it... Got any data to show AMD is winning the war? The $157M loss for AMD? Is that good? The $203M profit for NV? Is that bad? Financial models say AMD has a 64% chance of being out of business in the next 2 years; NV's chances = 1%. Which would you bet on?

Mobile grudge match shows NV winning 13 games and AMD winning 3 (and 2 of those 3 suck in the ratings, hence NV ignoring optimizing for them). NV 116% faster in Diablo 3. Yeah, that's bad news. But only for AMD.
Dec 2012. 4x more apps optimized for it. Bad for AMD.
Project Denver/Boulder moving into AMD/Intel's space finally. Umm, bad for AMD CPUs. Many others also moving here. Bad for both Intel & AMD all around.
NVDA kicked AMD's A$$. Oct 15th too old for you too? Relevant data, correct?
65% share of the GPU card market for NV. Last time I checked, that means everyone else is sucking. I don't care whose side you "LIKE"; if I'm buying a stock and have half a brain in my head, I run FROM AMD and TO NVDA.
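
One arithmetic note for readers skimming claims like the "116% faster" figure above: "X% faster" describes a multiple of the baseline, so 116% faster means 2.16x the baseline frame rate, not 1.16x. A tiny illustrative sketch (the 45 FPS baseline is made up, not a benchmark):

```python
# "X% faster" translates to a (1 + X/100)x multiple of the baseline.
# The 45 FPS baseline below is purely illustrative, not a benchmark.
def speedup(pct_faster: float) -> float:
    """Convert an 'X% faster' claim into a speedup multiple."""
    return 1 + pct_faster / 100

baseline_fps = 45.0
print(f"{speedup(116):.2f}x")                    # 2.16x
print(f"{baseline_fps * speedup(116):.1f} FPS")  # 97.2 FPS
```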

The only really relevant data I need is roadmaps and balance sheets. CEO statements on the financial calls say a lot too. AMD's sucked. NV was cheering. I love AMD but wouldn't touch their stock with YOUR 10' pole, let alone my own :)

Please, by all means: if you have something in DATA that says otherwise, point to it, because I can't see it anywhere I look. All I see is absolute and unequivocal failure for AMD. 30% layoffs in ~2 yrs also.
2 more execs jumped ship.
The list of people who've left or been laid off is IMMENSE. It took ages for them to find anyone who'd take the CEO job. Rory Read isn't a top choice. One wonders what they had to offer to even get him... They should have picked another Dirk Meyer, or Jerry Sanders. Those two created great chip tech and some actual profits. They need an engineer running things. A guy from IS should be UNDER the engineer running the company.

Andy Grove, Craig Barrett, Dirk Meyer, Jerry Sanders, Robert Noyce, Gordon Moore! All chemistry, engineering, science, physics. These people create products. Others just sell them. You can't win without the former. The latter don't create squat. Laying off tons of engineers doesn't help the cause either. Rory should have let go of management and kept engineers at all costs. Good luck hiring new engineers when most feel you're on your way to bankruptcy. That's not really called a stable job in a scary economy. With Obama and co. driving us into the ground faster each year, you need to take a job at a company that has a GREAT FREAKING balance sheet, or worry about your job getting cut every week.

Nvidia GPU market
By sharkbaitnate on 11/9/2012 12:43:44 PM , Rating: 1
Unless you have been ignoring Kepler GPUs, Nvidia actually leads the GPU market and has better price/performance than their competition. Of course, this statement is made with a disclaimer.

AMD has lower-performing GPU technology with higher price-versus-performance numbers. They also have smaller market share than Nvidia. That is, if you talk about discrete cards only. If we include integrated systems, then you have to include Intel and integrated AMD systems, and you find Nvidia in 3rd. Should these integrated systems be included in GPU assessments? From a forward-looking business perspective, sure, as they will have a real impact on sales of discrete cards. But from a market standpoint? Not at all. An integrated chip can't be included in GPU sales numbers because it is not a discrete GPU. A GPU is an optional component of a computer that improves performance. You can run the system on CPU computational power alone, with integrated chipsets, but this is not truly a GPU system, as the CPU is still doing the heavy lifting.

Only below the Geforce 650 is AMD a better option for the price point.

RE: Nvidia GPU market
By ipay on 11/11/2012 3:22:03 AM , Rating: 4
Actually with the latest drivers (Catalyst 12.11) AMD has higher performance across the board in discrete cards.

Add to that their new triple A game bundle for HD7XXX cards and they have both the best value and performance for anyone looking to upgrade.

RE: Nvidia GPU market
By spaced_ on 11/12/2012 3:23:01 AM , Rating: 2
This only seems to show desktop GPUs.

What about the rather large mobile GPU space?

RE: Nvidia GPU market
By TheJian on 11/12/2012 11:40:20 AM , Rating: 2
Against reference cards, which only a fool buys, as they're no cheaper than heavily OC'd Nvidias out of the box. Does anyone actually sell reference-clocked Nvidia cards? So I take those benchmarks with a pinch of salt at least. They could at least use a retail card. If you don't get an extra 10-15% free out of the box (no need to modify; they're SOLD OC'd out of the box), you're an idiot.

And not with R310 drivers - Gee didn't take long did it? Today...LOL:
Nvidia's answer to AMD's driver no doubt ;)
GeForce GTX 680:

Up to 26% in Call of Duty: Black Ops 2
Up to 16% in Battlefield 3
Up to 18% in Assassin's Creed III
Up to 9% in The Elder Scrolls V: Skyrim
Up to 6% in Medal of Honor: Warfighter
Up to 6% in StarCraft II
Up to 6% in Dragon Age II
Up to 6% in Batman: Arkham City
Up to 5% in S.T.A.L.K.E.R.: Call of Pripyat

GeForce GTX 660:

Up to 24% in Call of Duty: Black Ops 2
Up to 10% in Battlefield 3
Up to 7% in The Elder Scrolls V: Skyrim
Up to 5% in Dragon Age II
Up to 5% in Assassin's Creed III
Up to 4% in Batman: Arkham City
Up to 4% in Medal of Honor: Warfighter
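
Note that these "up to N%" figures are best-case uplifts over the previous driver. A quick sketch of what such a claim means for frame rates (the baseline FPS values here are hypothetical, not measured benchmarks):

```python
# What an "up to N%" driver uplift means for frame rates.
# Baseline FPS values are hypothetical, not measured benchmarks.
def apply_uplift(baseline_fps: float, uplift_pct: float) -> float:
    """Return the frame rate after a percentage uplift."""
    return baseline_fps * (1 + uplift_pct / 100)

claimed = {
    "Call of Duty: Black Ops 2": 26,  # "up to 26%" on the GTX 680
    "Battlefield 3": 16,
}
baseline = 60.0  # hypothetical pre-driver frame rate
for game, pct in claimed.items():
    print(f"{game}: {baseline:.0f} -> {apply_uplift(baseline, pct):.1f} FPS")
```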

Who buys a reference-clocked card on purpose, such as the ones used in your link? Does the driver fix the noise and heat that come with these Radeons vs. NV cards? Nope. Does the driver magically give you CUDA or PhysX? Nope. Will any driver EVER fix heat or noise, or ever give you CUDA or PhysX? NOPE. Will the driver magically fix their profits? Nope. NV's driver will be benchmarked in a few weeks after beta and it will be the same story again. Adding more free games just takes more away from profits. This is essentially yet ANOTHER price cut killing their bottom line. It's like taking $30 off their top cards (I figure they're paying around $10 each). STOP NOW, AMD. YOU'RE GOING BANKRUPT. You simply cannot win a price war with a company that has more money in cash than your entire company is worth. You will LOSE. Never mind the fact you have $2.05B in debt and they have ZERO. Quit GIVING away your cards and start charging MORE, or say hello to your Q4 xmas loss. How dumb are you people at AMD? And for those cheering the free crap/price cuts constantly: it all comes with a price tag. NO COMPETITION IN 2 YRS. Which means $500 cards that currently cost $100-200. We don't need AMD going bankrupt. But they seem driven to go there as fast as Obama wants to drive us off the fiscal cliff. Cheer for the free stuff all you want, but realize once it runs out and AMD is gone (like welfare we can't afford), you'll all be paying a far higher price for the next video card.

47million on food stamps and welfare for 20mil illegals=America bankrupt shortly.
3 AAA games free and a half dozen price cuts this year=AMD bankrupt shortly.
Oh and 20% off warfighter...So basically another $10 cut from AMD's top cards.

Retards. You'll all regret voting for a fool handing out welfare to 50% of the country (+20mil illegals free medical/food stamps about to come online), just as you'll all regret AMD giving away more of their NON-Existent profits. AMD is printing money they don't have, just like Obama.

You can't run a country like that, and you can't run a business like that either. In both cases idiots were hired to run the show (a community organizer who broke Chicago has no idea how to create jobs - he's never run a business; Rory Read doesn't design chips or any other product - same story). And in both cases, it won't be long before the SHOW is over.

T3 golden days..
By rocketbuddha on 11/9/2012 12:46:02 PM , Rating: 3
Second, NVIDIA enjoys a key psychological advantage over Qualcomm in that it's pumping out quad-core units of its latest and greatest Tegra 3 processor

One of the main reasons NVIDIA had success getting Tegra 3 into phones is the supply constraint on 28nm wafers at TSMC, which left Qualcomm unable to meet demand for its Krait-based SoCs.

And since Samsung only provides its Exynos 4 (which is superior to T3 in every way) to Samsung Mobile and a few Chinese OEMs, NVIDIA was able to pick up everyone who could not get S4 allocation...

Performance-wise, dual-core S4 Krait SoCs were better than T3 across a multitude of tests.

If only TSMC had the 28nm capacity, Tegra 3 would have lasted like Tegra 2 (as soon as any competition was available, T2 disappeared from phones). Now with S4 Quads becoming available with far better Adreno 320 cores, T3 is no longer premium.

That's a big reason why NVIDIA is thus far outpacing Qualcomm in design wins in the emerging Windows RT ARM-based laptop/hybrid device market.

Qualcomm's killer strength is actually its radio baseband technology which is not a priority in the tablet market.
In fact, QComm has never been successful in the tablet market (aside from the failed HTC Flyer, which tablet was using a QComm SoC??).
But the new S4 Quads (LG Optimus G, Nexus 4) are APQ-only, aka SoCs with no baseband. And these can enter the tablet market if priced right.
Add to that the Cortex-A15-based Exynos 5: if Samsung is able to sell it to their competitors, it would put a severe dent in T3 tablet expansion until they come up with the (A15-based??) T4.

T3 does temporarily have success in the tablet market, being the SoC in the Nexus 7 and Surface RT, which are guaranteed to sell pretty well among non-"i" tablets.

I believe the Q4 forecast is more related to the reality that competition has better products in the market and NVIDIA has already milked the 40nm T3 for all it's worth...

RE: T3 golden days..
By killerroach on 11/9/2012 3:51:11 PM , Rating: 2
You figured out what Nvidia's advantage is, although you've danced around it... they have an incredible first-mover advantage in the market. Sure, their SoCs get crushed by their generational equivalents from Qualcomm and Samsung, but they're out first, which allows them to get a lot of design wins and into the hands of a lot of consumers before everyone else. They're almost like the cinematic trailer for each new generation of mobile SoCs - sure, the feature film is almost certainly better, but you get to see the trailer in theaters a few months prior.

RE: T3 golden days..
By TheJian on 11/13/2012 7:00:16 AM , Rating: 2
Which is why they're gearing up 20nm at Samsung; Denver sampled clear back in May on 20nm. It takes around 9 months from sample to chip, I think, so ~Feb/March products. Just in time for the next rev of everything. Again, early to the party to get into tons of designs for xmas etc. I don't think anyone else will have anything below 28nm before 2014 (GF/TSMC are 2014 for 14nm, and not early 2014 - that will be INTEL, not these guys). So NV alone with Samsung on 20nm unless

Worse, QCOM will be fighting someone who can BRIBE TSMC to make more chips for them than Nvidia ever could: APPLE. They already tried a billion-dollar bribe for 100% exclusivity to try to secure chips for them and their next launch.

So Apple now causes problems for all the other players while leaving Samsung with lots of wafers not being used by Apple. How do you fill that void with something you can COUNT on needing wafers? The #1-producing GPU maker of HUGEMONGOUS chips. Kepler is 3x the size of an SoC. Kepler owns 65% of the GPU card market. Nobody in sight to knock them off but AMD, who is bleeding to death and has a small share compared to NV. You can count on NV needing 65% of the market again next year, etc. IF you went with some SoC maker like Marvell, Qcom etc., you can't count on them getting into a phone like the iPhone and being the next hit. Too many others to knock them off. So why not go with someone who is LOSING competition and has arguably cornered their market? Samsung wisely chose NV, who has a decent volume of Tegra need and a steady, reliable need for big GPUs with no end in sight.

Everyone else gets hurt by the move. AMD still depends on crappy TSMC for GPUs. Nvidia just got a leg up with Samsung at 20nm and gets stuff out ON TIME and, more importantly, with GOOD YIELDS! Apple couldn't launch huge product volumes with TSMC, who fails at every turn. Kepler is just now getting top-to-bottom coverage because VOLUME/YIELDS at TSMC on 28nm have sucked for ages. Same at 40nm, etc... They mess up everything they do for 6 months or more, which screws everyone's launches. The OP was correct: you're stating why NV has an advantage, and it looks like it just got better. First to the next movie, and on a better process by all counts.

You're comparing a 40nm old Tegra3 to the latest and greatest at 28nm. You act like NV has no answer. Jen was quoted in a conference call saying they dev'd Tegra 3/4/5/6 all at the SAME TIME! Denver popped out samples in May in Austin at 20nm. Are you doing the math yet?

You're building the case for why NV will continue to succeed. Even QCOM said what you just said. NV only got designs because they released first...Well DUH. If nothing else is available and Quad sounds better than dual then quad wins. :) The move to do a big little design so to speak at 40nm won designs and is still doing it. You can just now start to buy a quad snapdragon/samsung.

Depends on the test as to who wins what. On CPU, Tegra still does OK. On graphics they've been surpassed, but I don't think that will be the case when we benchmark GAMES instead of the OFFSCREEN crap they do at AnandTech etc. Why benchmark what you don't see...LOL. Games are coming based on Unreal 3, and I suspect at least ONE will want to be the gaming benchmark everyone uses and have a timedemo or something built in. This will separate the producers of a chip from the producers who ALSO work with the software devs that make the games that run on said chip. NV is king here. Witness them winning 13/16 games vs. AMD in the laptop grudge match at AnandTech. They stomped AMD, and it can't all be because of hardware. NV beating AMD by 116% in Diablo 3 has a lot to do with optimizations and working with the devs.

Until games come (real ones, not Angry Birds 10...LOL), nobody really knows anything but battery life while browsing/watching movies.

Where's the beef?
By Motoman on 11/9/2012 10:32:39 AM , Rating: 5
NVIDIA Mirrors Qualcomm's Hot Steak With Earnings Surprise of Its Own




Copyright 2016 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki