



The GTX 580 is the most powerful single-GPU card solution, according to early testing. It delivers approximately a 10 percent bump from the GTX 480, but costs about 20 percent more.  (Source: T-Break)
Higher retail prices mean that you spend about 20 percent more to get a 10 percent performance boost

Today NVIDIA officially launched the GTX 580, the first card in its GeForce 500 Series.  Like AMD's Radeon 6000 series, the GeForce 500 series isn't a major architectural redesign and is still produced on the same 40 nm process by Taiwan Semiconductor Manufacturing Company, Ltd.

Although the card itself isn't exactly earth-shaking, this launch is clearly a big deal for NVIDIA, as it closes the company's release-time gap with AMD.  And the card reportedly picks up right where NVIDIA left off, pushing the company's high-end performance even higher.  But how big an impact will it have?  Continue ahead for our thoughts.

The Card

The GeForce GTX 580 is absolutely a performance beast.  In some regards, it is basically an overclocked GTX 480.  But it also bumps the shader processor (aka "CUDA core") count from 480 (on the GTX 480) to 512, and it tacks on four extra "Special Function Units", taking that total to 64.

The die size has actually shrunk slightly to 520 mm^2, while the transistor count holds steady at 3 billion.

Clock speeds have been bumped up across the board.  The standard core clock jumps from 700 MHz to 772 MHz.  The shader clock is bumped from 1.401 GHz to 1.544 GHz.  And the GDDR5 memory clock is pushed from 3.696 GHz to 4.008 GHz.

Despite the clock speed increases, the TDP is down from 250 watts to 244 watts.  Impressive.

NVIDIA dubs the new GPU GF110, but essentially it appears to be very similar to the GF100 found in the GTX 480.

Other than the clocks and the number of processing units, pretty much everything else stays the same.  The memory bus is still 384 bits wide, and the card still carries roughly 1.5 GB of GDDR5.

The clock and unit-count increases yield roughly a 4.1 percent increase in pixel fill rate, a 17.6 percent increase in texture fill rate, and an 8.5 percent increase in memory bandwidth.
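
For readers who want to sanity-check those figures, here is a minimal back-of-the-envelope sketch in Python.  The texture-unit counts used below (60 on the GTX 480, 64 on the GTX 580) are assumptions based on common spec listings rather than numbers stated in this article, so treat the output as approximate.

# Rough check of the spec-sheet gains quoted above.  The texture-unit
# counts (60 vs. 64) are assumed, not taken from this article.
gtx_480 = {"core_mhz": 700, "mem_mhz": 3696, "tex_units": 60}
gtx_580 = {"core_mhz": 772, "mem_mhz": 4008, "tex_units": 64}

def pct_gain(new, old):
    """Percentage increase of new over old."""
    return (new / old - 1.0) * 100.0

# Texture fill rate scales with texture units x core clock.
tex = pct_gain(gtx_580["tex_units"] * gtx_580["core_mhz"],
               gtx_480["tex_units"] * gtx_480["core_mhz"])

# Memory bandwidth scales with the effective memory clock (same 384-bit bus).
bw = pct_gain(gtx_580["mem_mhz"], gtx_480["mem_mhz"])

print("Texture fill rate gain: %.1f%%" % tex)   # roughly 17.6 percent
print("Memory bandwidth gain:  %.1f%%" % bw)    # roughly 8.4 percent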

Clearly things are moving in the right direction for NVIDIA.

Moving on from the base electronics, NVIDIA's biggest addition is a new vapor-chamber cooler.  Vapor-chamber coolers run quieter and cooler than conventional designs, and early reports indicate that NVIDIA's cooler is indeed as effective as claimed.  The downside is that they can suffer corrosion issues you wouldn't get in a standard cooler, and they cost more -- for the hardware maker, at least (more on price later).

The card requires a 6-pin and an 8-pin power connector.  Up to four GTX 580s can be chained together in SLI.

Do Any of You Guys Want to Go Fast?

NVIDIA and AMD clearly have different priorities.  If you "wanna go fast," NVIDIA appears to have the edge with its current lineup.  If you want the most bang for your buck, AMD looks to hold a slight lead.  In that regard NVIDIA is clearly subscribing to the Ricky Bobby mindset -- "if you ain't first, you're last".

The bad news is that the GTX 580 isn't the fastest card on the planet.  The good news, for NVIDIA, is that the GF110 appears to be the fastest GPU on the planet, making the GTX 580 the fastest single-GPU card.

T-Break was the first to publish benchmarks for the card (they were the only ones bold enough to break the embargo).  They found that in DirectX 11 testing the new card outperforms both the GTX 480 and AMD's 2 GB Radeon 5870 by at least 10 percent.  It's about 10-15 percent slower than the dual-chip AMD Radeon 5970 -- still an impressive feat.

In DirectX 9 and 10 benchmarks, the GTX 580 furthered its lead, beating even the Radeon 5970 in Far Cry 2 and Street Fighter IV.  It earns roughly a draw in StarCraft II, which may not be overly meaningful, given that the title is heavily CPU-limited and relatively light on the GPU.

The card also appears to be a decent overclocker, with T-Break reporting a stable overclock of 8 percent (to 832 MHz).

Is the Price Right?

The NVIDIA GTX 580 is clearly a tempting card, particularly for those enthusiasts who still feel some love for old green.  But can the company be competitive with pricing?

When we aired our piece on the upcoming card yesterday, many commented that the card was going to be "$600" and hence a poor deal.

Those fears seem largely unwarranted.  NVIDIA is officially pricing the hot card at $499 USD.  While NewEgg.com lists all of the cards at $570 USD or more, a 10 percent promo code takes the cost down to around the $500 target price ($520 to be more precise).  Considering that you can snag a GTX 480 for as cheap as $429 USD, the overall picture becomes a bit clearer -- you can get a roughly 10 percent real-world performance increase by paying roughly 20.9 percent extra.  That's reasonably competitive in the enthusiast world.

That said, NVIDIA and AMD clearly have different operating philosophies.  AMD launched its first 6000 series cards (in the Barts subfamily) -- the Radeon 6850 and 6870 -- late last month.  Really, these cards couldn't be farther from the GTX 580 in terms of target audience.

The Radeon 6850 and 6870 retail for target prices of $179 USD and $239 USD, respectively.  That means you could pick up two (!) Radeon 6870s, and still pocket about $40, for the price it would take you to obtain the beastly GeForce GTX 580.
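
For a quick side-by-side, here is a minimal sketch using the street prices quoted above (roughly $520 for a GTX 580 after the promo code, $429 for a GTX 480, and $239 per Radeon 6870); retail prices drift daily, so the output is only a ballpark.

# Price comparison using the street prices quoted in this article; these
# change frequently, so the percentages are approximate.
gtx_580_street = 520.0   # NewEgg listing after the 10 percent promo code
gtx_480_street = 429.0
radeon_6870    = 239.0

# Premium NVIDIA asks for the roughly 10 percent real-world gain over a GTX 480.
premium = (gtx_580_street / gtx_480_street - 1.0) * 100.0
print("GTX 580 premium over a GTX 480: about %.0f%%" % premium)  # ~21% for ~10% more speed

# AMD's alternative: two Radeon 6870s for less than one GTX 580.
pair = 2 * radeon_6870
print("Two Radeon 6870s: $%.0f, leaving about $%.0f versus one GTX 580"
      % (pair, gtx_580_street - pair))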

NVIDIA has been relatively competitive in dropping the price of its lower end Geforce 400 series models, but AMD seems to be doing a bit better in the price department.

Conclusions

Looking ahead, AMD will soon launch its new and improved performance lineup -- led by the Cayman (single GPU) and Antilles (multi-GPU) subfamilies.  These cards should be the real test of the GTX 580's dominance, particularly the Radeon HD 6970 (Cayman XT).

It might be wise for enthusiasts who don't absolutely need the GTX 580 right now to wait a couple of weeks.  Once AMD launches its high-end counterpunch, the price of the new GeForce GTX 580 will likely drop.

By the same token, NVIDIA is almost certainly working on more budget-friendly GeForce 500 series entries.  Unlike with Antilles and Cayman, though, much less info is available on when these budget models might launch.  So wait at your own peril.

At the end of the day, though, NVIDIA has cut a roughly eight-month lag behind AMD (its GeForce 400 series trailed the Radeon 5000 series, while the GTX 580 arrives within weeks of the Radeon 6000 series) to less than a month.  Clearly things are shaping up at old green.  Ultimately, though, the company may come to regret not targeting the higher-volume market first, as AMD is doing.


Comments



Triple Screens?
By Mitch101 on 11/9/2010 10:11:48 AM , Rating: 2
Do you have to buy two of them for triple screens? I haven't seen a review showing that you can do it unless you buy two of them.

Not trying to diss NVIDIA, but my $130.00 AMD/ATI Radeon 5770 runs triple monitors and good framerates on most games at 5760x1080 resolution. Sure, I have to drop the resolution down for some games (4320 x 900), but I wouldn't give up widescreen gaming if it requires two of these $500.00 cards to do it.

If a single $500 video card can't do triple screens, I personally think my $130.00 video card purchase was a better one.




RE: Triple Screens?
By kattanna on 11/9/2010 10:37:13 AM , Rating: 2
yes, you will need 2 video cards for 3 monitors


RE: Triple Screens?
By Mitch101 on 11/9/2010 11:01:47 AM , Rating: 2
Kudos to anyone who can afford to pay $1000.00 on video cards.

I don't know a single person who really wants to play in single monitor mode after playing in triple monitor mode/widescreen gaming. Triple monitor support should be a standard option for gaming video cards today.


RE: Triple Screens?
By nuarbnellaffej on 11/9/2010 11:40:56 AM , Rating: 2
quote:
I don't know a single person who really wants to play in single monitor mode after playing in triple monitor mode/widescreen gaming. Triple monitor support should be a standard option for gaming video cards today.

I would argue that a three monitor setup isn't practical for most people, but then again neither are enthusiast-level cards, so it does seem kind of odd that they haven't implemented it yet.


RE: Triple Screens?
By Mitch101 on 11/9/2010 12:18:16 PM , Rating: 2
You're probably right. Total cost for 3 monitors, video card, and DP-to-VGA adapter was around $500.00. A single monitor feels like you're wearing blinders compared to triple screens, but I love it. Best upgrade since my first 3DFX Voodoo video card.

AMD/ATI had triple screens in mind when they began their design. For NVIDIA it was probably more reaction to AMD/ATI coming out with triple screens on a single card.


RE: Triple Screens?
By theapparition on 11/9/2010 1:21:19 PM , Rating: 2
$500 for everything?

What kind of 1920x1080p monitors are you getting for $120?


RE: Triple Screens?
By nuarbnellaffej on 11/9/2010 2:12:34 PM , Rating: 2
I'm sure it's a typo, though if he says he actually meant $5,000 I might die of envy. :p


RE: Triple Screens?
By Mitch101 on 11/9/2010 2:27:10 PM , Rating: 5
How I did it
Wait for a 21.5" monitor to go on sale at staples like below.
http://njdevil.com/index.php?item=E2236VW
Or Similar monitor. I got 3 21.5" Acers for less than that.

Get the Staples $25.00 off $100.00 on e-bay.
Monitor $129.00 - Rebate $25.00 = $105.00 usually with free shipping too. Add Tax too so lets say $112.00 each x 3 = $336 in monitors

Go to Newegg and get a Radeon 5770 for about $130.00
Monitors $336 and $130 video card = $466.00

I don't recall the exact DP-to-VGA adapter I bought; it was on the approved list to support 1920x1080p, and I believe I got it from Amazon.com for around $32.00.
$466+$32 = $498.00


RE: Triple Screens?
By Mitch101 on 11/10/2010 10:37:35 AM , Rating: 2
Here is the Approved list of adapters - Slow Load
http://www.accellcables.com/products/DisplayPort/D...

Here is the one I'm using - note the reviews.
$24.99 & eligible for FREE Super Saver Shipping
Accell UltraAV B101B-003B Display Port with VGA Active Adapter
http://www.amazon.com/Accell-UltraAV-B101B-003B-Di...

Now I remember: I bought something else with it to get the free shipping, which is why it was $32.00 in my head.

Also, a Radeon 5770 is now $110.00 at NewEgg. I play a lot of Left 4 Dead in triple-screen view. The sides take a little getting used to, since some items appear closer, but you get used to it.


RE: Triple Screens?
By symbiosys on 11/9/2010 5:25:42 PM , Rating: 2
I run the exact same setup as you, mate; my 5770 runs 3 monitors with ease, with more than enough grunt to power most games :) LOVE IT!


No problem waiting...
By JBird7986 on 11/9/2010 9:41:05 AM , Rating: 2
quote:
At the end of the day, though, NVIDIA has cut a roughly eight-month lag behind AMD (its GeForce 400 series trailed the Radeon 5000 series, while the GTX 580 arrives within weeks of the Radeon 6000 series) to less than a month. Clearly things are shaping up at old green. Ultimately, though, the company may come to regret not targeting the higher-volume market first, as AMD is doing.


I dunno about that. I was getting ready to pull the trigger on a 1GB GTX 460, but I think I may wait a couple of months for the equivalent 500 series card at the $200 price point. Might as well get into the latest generation, especially if it's going to be out before January.




RE: No problem waiting...
By Da W on 11/9/2010 10:06:00 AM , Rating: 2
Gonna get a second 5770 to run in CrossFire this year for a cheap $139. Six monitors, performance like a single 5870 -- that's gonna be enough for me.


RE: No problem waiting...
By nafhan on 11/9/2010 10:46:55 AM , Rating: 2
The "equivalent" 500 series card is the GTX 460. If they do release a GTX 560, it'll probably just be a 460 with all shader clusters enabled.


too rich for my blood
By hclarkjr on 11/9/2010 9:33:56 AM , Rating: 2
way too pricey for me, will buy an AMD card for my upgrade




RE: too rich for my blood
By Slaimus on 11/9/2010 1:34:46 PM , Rating: 2
I thought so too, and was close to pulling the trigger on the 6850 when the price of the GTX 460 bottomed out. Ended up buying a GTX 460 768MB for $130 after rebate. I don't know how Nvidia is even able to do this with a chip that is bigger than Cypress.


Everybody wins, except Nvidia
By Phoque on 11/9/2010 6:01:18 PM , Rating: 3
Nvidia customers: win
Nvidia : lose
AMD customers: win
AMD : win

Why does Nvidia lose? Because AMD is the one who will dictate the price of graphics cards. Their profit margin per card is much bigger than Nvidia's.

~~~~~~~~~~~~~~

Transistors-versus-performance-wise:
Nvidia has a great GPGPU and a very good GPU.
AMD has a great GPU (and a ? GPGPU ?).

~~~~~~~~~~~~~~

Most of the money still comes from the GPU market. That's why Nvidia is in trouble, IMHO, with their architecture. It will always take them more transistors to match or surpass AMD's performance on the GPU side, and that ends up eating their profit.

Actually, from design to the last chip sold some time from now, I'm wondering if Nvidia will even make a penny out of this whole Fermi (400/500) adventure. Obviously, all is not lost, and they will leverage that experience in future generations.




Keep pushing!
By therealnickdanger on 11/9/2010 9:40:41 AM , Rating: 2
I'm glad to see that their new cooling method has real-world benefits. The noise reduction is very impressive. The overall performance is not so impressive, but I'm sure people with cash to burn will be more than willing to upgrade!




happy to be wrong
By kattanna on 11/9/2010 10:39:12 AM , Rating: 2
quote:
many commented that the card was going to be "$600" and hence a poor deal


And I was one of them. But I am happy to be wrong.

http://www.hardocp.com/article/2010/11/09/nvidia_g...

there is a nice review of the card




By mmcdonalataocdotgov on 11/9/2010 11:42:53 AM , Rating: 2
10% off of $570 would be exactly $513, not roughly $520, which, you could add, is even closer to the $500 suggested retail. That's more words.

And that means that you would pay exactly 19.58% more for the 10% increase, let's call it 20%, as opposed to "roughly 20.9%." But that's less words, so you would still have some analysis to do for the word count to be high enough.

</dick>




3 for 1
By mavricxx on 11/10/2010 11:31:00 PM , Rating: 2
I'm going with ATI/AMD on this one, since I can do triple monitors with one card.




Hmm
By Silver2k7 on 11/11/2010 8:02:17 AM , Rating: 2
[quote]If you "wanna go fast," NVIDIA appears to have the edge with its current lineup.[/quote]

The AMD (ATI) card competing with the Nvidia 580 would be the 6970 (replacing the ATI 5870 as the fastest AMD single-GPU card), which is to be released shortly... so unless you have the 6970 benchmarked already, how can you tell?




By derricker on 11/12/2010 12:51:50 PM , Rating: 2
By releasing the fully enabled version of the Fermi chip to the desktop market, they have given AMD a reason to keep their prices down and avoid the whole overpriced 58xx-series release we saw last generation.

The fanboys need to stfu, we need both companies alive for *our benefit* as consumers.




Nvidia is desperate
By Gungel on 11/9/10, Rating: -1
RE: Nvidia is desperate
By Shig on 11/9/2010 9:39:40 AM , Rating: 2
I wouldn't say AMD is doing that well either, considering they have to compete with Intel too.

They're both a little desperate if you ask me.


RE: Nvidia is desperate
By Da W on 11/9/2010 10:08:28 AM , Rating: 2
We shall see with AMD. I have high hopes for Bulldozer and some sort of hybrid-crossfire mode with the AMD graphic card. Just a little push for a bit more performance Nvidia and Intel can't match. Probably not until the 7870 though.


RE: Nvidia is desperate
By Cheesew1z69 on 11/9/2010 11:27:52 AM , Rating: 2
They compete with on-board video? Does Intel even MAKE a dedicated GPU? No?


RE: Nvidia is desperate
By HrilL on 11/9/2010 1:30:06 PM , Rating: 3
Compete? Hardly. AMD IGPUs are in a whole different class compared to Intel's. Intel just happens to have a stranglehold on the chipset market for their CPUs. Nvidia is not allowed to make chipsets for the i# line of CPUs, and I don't believe AMD is either. Neither can even get into that market to attempt to compete, and they'd likely dominate Intel, since Intel makes the worst IGPUs on the market.


RE: Nvidia is desperate
By spread on 11/9/2010 10:24:16 AM , Rating: 4
quote:
considering they have to compete with Intel too.


Hahaha. That was a good one. I can't wait to get a new Larrabee video card for my computer next year.


RE: Nvidia is desperate
By majorpain on 11/9/2010 2:26:37 PM , Rating: 2
what the hell does Intel have to do with this post?


RE: Nvidia is desperate
By Phoque on 11/9/2010 8:12:06 PM , Rating: 2
quote:
They're both a little desperate if you ask me.


Maybe at this time, but I feel AMD is the one in the best position to attack the future. It's got good CPUs, something Nvidia will be cruelly lacking once the GPU and CPU are on the same die or package. AMD also has great GPUs, something Intel can only dream of, even if Intel's graphics aren't as bad as they used to be.

Also, CPU processing power is becoming relatively less important these days for the mass market. It is the GPU that is gaining in importance for crunching numbers and rendering graphics. When you build up a computer, do you pay more for the CPU or GPU? For these reasons, I believe AMD has a bright future.


RE: Nvidia is desperate
By Silver2k7 on 11/11/2010 8:07:50 AM , Rating: 2
OK, show me the discrete Intel video card which competes with the GTX 460 or the GTX 580 ;-)


RE: Nvidia is desperate
By theapparition on 11/9/2010 9:41:57 AM , Rating: 5
Fanboi much?


RE: Nvidia is desperate
By Gungel on 11/9/10, Rating: -1
What a waste...
By jonmcc33 on 11/9/10, Rating: -1
RE: What a waste...
By mmcdonalataocdotgov on 11/9/2010 3:26:46 PM , Rating: 1
I agree. PC gaming is superfluous these days. Why strap yourself with legacy architecture and then load in a high-end GPU just to overcome that, when the gaming consoles do it better, on a larger screen, for much less money, as you say.

I would, however, get this card if I had a new rig, but only to run Adobe CS5, which leverages GPU cycles for image manipulation. But not for gaming.


RE: What a waste...
By klutzInMotion on 11/9/2010 4:41:33 PM , Rating: 2
quote:
I would, however, get this card if I had a new rig, but only to run Adobe CS5, which leverages GPU cycles for image manipulation. But not for gaming.


Man, just what kind of truth distortion field are you creating that you need a GTX 580 for?


RE: What a waste...
By mmcdonalataocdotgov on 11/10/2010 11:51:16 AM , Rating: 2
Well, man, I generally process images and composites that are greater than 150 - 200MB on disk, and I use a separate physical disk to run the PS swap file, just to keep it away from the OS swap file disk (the OS swap file is hard set to prevent fragmentation and encroachment as well). CS5 leverages 64-bit memory addressing and GPU cycles to manage working (memory) files, which keeps them off the scratch disk. This is crucial when processing images in 16-bit or higher, as I often do when working from RAW files. Single operations or scripts can take as long as 20 minutes on my current 32-bit rig.

I wasn't talking about 8-bit jpgs of my weiner from my camera phone.


RE: What a waste...
By Silver2k7 on 11/11/2010 11:37:52 AM , Rating: 2
If you're using multi-monitor setups, 120Hz monitors, or just plain old 2560x1600 resolutions, you might want as fast a video card as possible to keep minimum fps as high as possible. Or possibly several of them :)


RE: What a waste...
By cmdrdredd on 11/9/2010 6:00:54 PM , Rating: 2
I largely agree. There are hardly any really good games anymore. A few OK ones here or there, and none of them make me say "wow, my GTX 295 is so slow". I guess the cards are outpacing the developers too quickly, when it used to be the other way around. New game = new hardware. Not so much anymore. I haven't even played a PC game in a long time.

