
NVIDIA adds an interesting new price point, but does it release Titan demand, or simply create confusion?

2013 is going to be a big year in the graphics processing unit (GPU) arena, with the arrival of Intel Corp.'s (INTC) Haswell, which brings the company's strongest on-die graphics showing yet -- Iris Pro.  For graphics card maker NVIDIA Corp. (NVDA), the pressure is on to continue to convince the spectrum of gamers from casual to enthusiast that they need to keep buying discrete GPUs.

NVIDIA appears to be responding by shifting its focus to higher-priced, more powerful GPUs.  And thus far the strategy appears to be working.  Will this continue with the company's latest card?  Let's dig in.

I. Meet the New Middle of the High End

For all intents and purposes, the GeForce GTX 780 is a lower-priced, lower-performance (due to having slightly fewer CUDA cores, etc.) version of NVIDIA's Titan, which launched back in February.  Even the sleek silver cooler vaguely resembles Titan's metallic cooler.

In another light, the GTX 780 represents a pricier, more powerful GTX 680.  Both have Kepler-series GPU chips built on Taiwan Semiconductor Manufacturing Comp., Ltd.'s (TPE:2330) 28 nm process, but the GTX 780 uses a cut-down GK110, whereas the GTX 680 uses a fully functional, but less advanced, GK104.  Priced at $650 USD, the card is roughly $200 USD more expensive than the GTX 680, which currently hovers around $450 USD.

The NVIDIA GeForce GTX 780

The GTX 780 keeps the Titan's ROP count but cuts the video RAM in half -- down to 3 GB.  The card also has two additional SMX units (192 CUDA cores and 16 texture units each) disabled per chip.  The decrease in processing units has allowed NVIDIA to bump the clock speed, so the cores on the card are actually clocked faster than the Titan's.
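For reference, the SMX arithmetic works out as follows (a quick sketch; the 192-cores and 16-texture-units-per-SMX figures are Kepler's published per-SMX resources, and the enabled-SMX counts are those reported for each card):

```python
# Kepler GK110: each SMX unit contains 192 CUDA cores and 16 texture units.
CORES_PER_SMX = 192
TEX_PER_SMX = 16

titan_smx = 14   # GTX Titan ships with 14 of GK110's 15 SMX units enabled
gtx780_smx = 12  # GTX 780 disables two more SMX units

print("Titan CUDA cores:", titan_smx * CORES_PER_SMX)        # 2688
print("GTX 780 CUDA cores:", gtx780_smx * CORES_PER_SMX)     # 2304
print("Texture units lost:", (titan_smx - gtx780_smx) * TEX_PER_SMX)  # 32
```

That 384-core deficit is what the higher base clock partially compensates for.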

II. Between the GTX 690 and the GTX 680

Here's a recap of the specs:
[Chart: GPU high-end pricing spectrum, spanning the high end through the mid-high tier to the middle of the market]

As mentioned, the GTX 780 comes between the Titan/GTX 690/Radeon HD 7990 and the GTX 680/Radeon HD 7970GE.

III. Benchmarks Tell Tale of Confusion, Competition

In testing by AnandTech, the GTX 780 generally delivered about 90 percent of the performance of Titan; impressive considering that it's only 65 percent of the price.  On the flip side of the coin, versus Advanced Micro Devices Inc.'s (AMD) Radeon HD 7970GHz Edition, the GTX 780 is roughly 44 percent more expensive, while only giving a 22 percent performance bump.
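Those percentages can be sanity-checked with a quick performance-per-dollar calculation (a rough sketch using the article's street prices and the normalized AnandTech figures; actual value varies by game and by retailer):

```python
# Normalized performance (Titan = 1.00) and street prices cited in the article.
cards = {
    "GTX Titan":        {"price": 1000, "perf": 1.00},
    "GTX 780":          {"price": 650,  "perf": 0.90},        # ~90% of Titan
    "Radeon HD 7970GE": {"price": 450,  "perf": 0.90 / 1.22}, # GTX 780 ~22% faster
}

for name, card in cards.items():
    value = card["perf"] / card["price"] * 1000  # performance per $1000 spent
    print(f"{name}: {value:.2f} perf/$1000")
```

By this crude metric the Radeon HD 7970GE remains the best value of the three, which is consistent with the pricing critique above.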

In other words, the graphics market is still very competitive, with AMD matching NVIDIA at the $1,000 USD (GTX 690 vs. Radeon HD 7990) and $450 USD (GTX 680 vs. Radeon HD 7970GE) price points.

It remains to be seen whether the GTX 780 fulfills NVIDIA's hope -- releasing pent-up demand for the more-popular-than-expected Titan/GTX 690 at a lower price point -- or whether the pricey new GTX x80 flagship simply confuses so-called "prosumers".

Adding to the confusion is one thing not mentioned until now -- a pair of GTX 680s or Radeon HD 7970GEs costs $900 USD and consistently beats both the GTX 690 and the Radeon HD 7990 in performance.  So while the market is very competitive, consumers looking to buy on the mid-to-high end must navigate a confusing myriad of options.

Sources: NVIDIA [press release], AnandTech [benchmarks]


RE: This wasn't such a great idea IMO
By TakinYourPoints on 5/25/2013 11:29:02 PM , Rating: 3
quote: Fast forward and what we have now are a couple of graphics card companies that push the limits with $600+ cards. They are milking the now grown-up computer gamers that have a little more income to work with but are ignoring future gamers. Game developers target the widest audience, so these cards aren't even utilized except by the few that have multiple monitor setups (probably somewhat common among people reading DailyTech, but it's not really a mass appeal thing).

The problem with this argument is that mid-range and low end video cards are more than enough for the mass market.

These high end $400+ cards are for extreme niche cases. They're specifically for people who game at 2560x1440 or 1600, or people who do multiple monitor gaming with three displays.

As of today, if you're gaming on a standard 1920x1080 display, then a $200-$300 card is perfectly fine. Gamers at 1920x1080 are the majority, based on the Steam hardware survey.

Resolution is the main determining factor with cards these days. 1920x1080, 1680x1050, 1600x900, 1280x1024, and 1366x768 are the most common resolutions on Steam's survey. 1920x1200, which is also fine with something like a GTX 660 Ti, is only 3%, a much lower number of users compared to those other resolutions.

I game at 2560x1440 and I'm in a niche, less than one percent. Multimonitor resolutions make up less than a half-percent altogether.

Blaming the death of PC gaming on niche products that only move a few thousand units at most is missing the big picture, especially given that PC gaming is as big as it's ever been.

RE: This wasn't such a great idea IMO
By TakinYourPoints on 5/25/2013 11:32:28 PM , Rating: 2
Like, the only reason I have a GTX 680 is because I'm gaming on a 27". If I was still gaming on my 23" then I'd totally go with something less powerful.

High end cards really aren't necessary for most people, and they can get away with a lot less power while still getting a great experience.

RE: This wasn't such a great idea IMO
By Piiman on 5/26/2013 5:00:48 PM , Rating: 2
Then you wasted your money. I have a 680 and use it with three 23-inch monitors. Seriously, you thought you needed to go high-end for 4 more inches? Go buy some more monitors. :-)

By TakinYourPoints on 5/27/2013 5:09:22 AM , Rating: 3
There's a huge difference in needed horsepower between 1920x1080 and 2560x1440, you should know this. :)

I have a 24" 1920x1200 monitor just as a second monitor, and I'm fine with all that. No interest in triple-monitor gaming either. I don't like the gaps from bezels, nor do I like the distortion on the left and right monitors. Gaming on one 27" is perfect for me, that or a 30".
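The horsepower gap the commenters are debating follows directly from pixel counts, since per-frame GPU load scales roughly with the number of pixels rendered (a back-of-the-envelope sketch):

```python
# Pixels per frame at common gaming resolutions; shading and fill-rate
# demands scale roughly linearly with this count.
resolutions = {
    "1920x1080": 1920 * 1080,
    "1920x1200": 1920 * 1200,
    "2560x1440": 2560 * 1440,
    "3x1920x1080 (surround)": 3 * 1920 * 1080,
}

base = resolutions["1920x1080"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x vs 1080p)")
```

A 2560x1440 display pushes roughly 1.78x the pixels of 1080p, and a triple-1080p surround setup pushes 3x, which is why both commenters end up needing GTX 680-class hardware.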


Copyright 2016 DailyTech LLC.