



DX11 performance compared to the Radeon 5800 series (Source: NVIDIA)

Exceptional DX11 scaling (Source: NVIDIA)

Clearly visible are the Dual-DVI and mini-HDMI interfaces
Beats ATI soundly, but will you be able to buy it?

It was almost exactly six months ago that ATI, the graphics arm of AMD, launched the first-ever DirectX 11 video card. It has since launched a top-to-bottom lineup of discrete graphics cards, ranging from the Radeon HD 5970, the most powerful graphics card in the world, down to the passively cooled Radeon HD 5450, available for less than $50.

Meanwhile, its chief competitor NVIDIA has been struggling to prepare its first DX11 part. The company previewed its Fermi architecture in October, but details have been scarce, as manufacturing and yield problems with Taiwan Semiconductor Manufacturing Company's 40nm process continue to plague NVIDIA.

The first Graphics Fermi 100 (GF100) chips started production in January, but it wasn't until February that we learned that they would be sold under the GeForce GTX 480 and GeForce GTX 470 monikers. Some specifications finally showed up last week, and today we have the final details confirmed.

The die size of the GF100 chip is massive, measuring 23mm x 23mm for a total of 529 square millimeters. By comparison, the Cypress chips used in the Radeon 5800 and 5900 series cards come in at a more moderate 334mm², making the GF100 almost 60% larger.
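
A quick sanity check on those figures (a minimal Python sketch; the die measurements are the ones reported above):

```python
# Die-area comparison using the figures reported above.
gf100_area = 23 * 23                  # 23 mm x 23 mm = 529 mm^2
cypress_area = 334                    # Cypress (Radeon HD 5800/5900 series), in mm^2
print(gf100_area)                     # 529
print(gf100_area / cypress_area - 1)  # ~0.58 -> "almost 60% larger"
```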

However, performance is much stronger, with the GTX 480 and GTX 470 beating the Radeon HD 5870 and 5850 respectively, according to internal benchmarks shown to us by NVIDIA. The tessellation engine in particular is exceptionally strong, and the near-linear scaling bodes well for those enthusiasts considering a GF100 SLI setup.

The powerhouse chip consumes an extraordinary amount of power. NVIDIA recommends the use of a 600W power supply with the GTX 480, but that is just a minimum. Enthusiasts who want to prepare for a possible SLI setup should have a 1000W PSU at the minimum, with a 1200W PSU not out of the question.

The most disappointing news is that after all of this waiting, NVIDIA has told DailyTech that readers should not expect volume availability until the week of April 12. There are supposed to be "tens of thousands" of GF100 cards at launch, but production is slow since all of the cards that will be sold during the launch are reference boards built by NVIDIA.

Somewhat surprising is the lack of support for DisplayPort on the reference board, DisplayPort being the next-generation computer display standard set to replace VGA and DVI for desktops and laptops. NVIDIA states that the GF100 GPU itself supports DisplayPort, but it will be up to its board partners to include the connector in their own designs.

Instead, the reference design has two Dual-Link DVI ports and an almost useless mini-HDMI port. NVIDIA is keen to tout its three interfaces and support for its 3D Vision Surround technology, but users who wish to use more than two monitors at the same time will be required to use a second card. This contrasts strongly with ATI's Eyefinity technology present on all 5000 series cards, which supports the use of three monitors at the same time from a single video card.

There is no word yet on support for OpenGL 4.0 or OpenGL 3.3.



Graphics Card                          GeForce GTX 470        GeForce GTX 480
Graphics Processing Clusters           4                      4
Streaming Multiprocessors              14                     15
CUDA Cores                             448                    480
Texture Units                          56                     60
ROP Units                              40                     48
Graphics Clock (Fixed Function Units)  607 MHz                700 MHz
Processor Clock (CUDA Cores)           1215 MHz               1401 MHz
Memory Clock (Clock rate / Data rate)  837 MHz / 3348 MHz     924 MHz / 3696 MHz
Total Video Memory                     1280 MB                1536 MB
Memory Interface                       320-bit                384-bit
Total Memory Bandwidth                 133.9 GB/s             177.4 GB/s
Texture Filtering Rate (Bilinear)      34.0 GigaTexels/sec    42.0 GigaTexels/sec
Fabrication Process                    40 nm                  40 nm
Connectors                             2 x Dual-Link DVI-I,   2 x Dual-Link DVI-I,
                                       1 x Mini HDMI          1 x Mini HDMI
Form Factor                            Dual Slot              Dual Slot
Power Connectors                       2 x 6-pin              1 x 6-pin, 1 x 8-pin
Max Board Power (TDP)                  215 Watts              250 Watts
Recommended PSU                        550 Watts              600 Watts
GPU Thermal Threshold                  105° C                 105° C
MSRP                                   $349 USD               $499 USD
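
Two of the rows above are derived from the others. Below is a minimal sketch of that arithmetic, assuming the usual GDDR5 convention that the quoted data rate is 4x the memory clock (consistent with the 837/3348 and 924/3696 pairs in the table):

```python
# Bandwidth = bus width in bytes x data rate; fill rate = texture units x core clock.
def bandwidth_gb_s(bus_bits: int, data_rate_mhz: int) -> float:
    return bus_bits / 8 * data_rate_mhz / 1000   # MB/s -> GB/s

def fillrate_gtexels_s(texture_units: int, core_mhz: int) -> float:
    return texture_units * core_mhz / 1000       # MTexels/s -> GTexels/s

print(bandwidth_gb_s(384, 3696))     # 177.408 -> the 177.4 GB/s GTX 480 figure
print(bandwidth_gb_s(320, 3348))     # 133.92  -> the 133.9 GB/s GTX 470 figure
print(fillrate_gtexels_s(60, 700))   # 42.0    -> GTX 480 GigaTexels/sec
print(fillrate_gtexels_s(56, 607))   # 33.992  -> the 34.0 GTX 470 figure
```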

Comments



An Article Based on PowerPoint Facts from NVidia
By GTVic on 3/26/2010 7:26:15 PM , Rating: 5
"Beats ATI soundly, but will you be able to buy it?"
"according to internal benchmarks shown to us by NVIDIA"


I stopped reading right there...




By th3pwn3r on 3/26/2010 7:35:00 PM , Rating: 5
I found that hilarious myself. NVidia and their benchmarks are a huge joke. I run an NVidia-powered machine myself, and I wish I didn't, to be honest.


By dubldwn on 3/26/2010 8:10:25 PM , Rating: 3
quote:
NVidia and their benchmarks are a huge joke.

http://www.dailytech.com/ATI+Launches+Radeon+HD+58...


By SteelyKen on 3/27/2010 4:28:43 AM , Rating: 5
Doesn't do much for AT's reputation to state something its own review contradicts.


By Jansen (blog) on 3/27/2010 9:18:56 AM , Rating: 2
We weren't able to get our hands on a GF100 card this time around. Certainly would've liked to have done our own internal testing.


By Lifted on 3/27/2010 2:51:23 PM , Rating: 2
quote:
Somewhat surprising is the lack of support for DisplayPort

...
quote:
NVIDIA states that the GF100 supports DisplayPort


I see you were really struggling to get those 1,000 (or whatever the magic number is) words in.


RE: An Article Based on PowerPoint Facts from NVidia
By XZerg on 3/28/2010 10:43:06 AM , Rating: 2
How, then, did the review get posted around the same time as this news item, if not earlier? I would hope DT and AT, being the same company, would share such information before putting up news that has been so looked forward to, even if most of what was coming was expected. The sad part is that DT's and AT's predictions/conclusions on the product don't even match.

Seriously, one company and this... despicable, imho.


By Calin on 3/29/2010 9:24:37 AM , Rating: 2
Better this than the 99% votes some things attract. I'll live with people having different opinions.


RE: An Article Based on PowerPoint Facts from NVidia
By Rike on 3/29/2010 12:14:47 PM , Rating: 2
AT and DT are not the same company. They are owned and run by different people. I hate to quote Wikipedia, but this information is so basic. Your complaint has no foundation.

http://en.wikipedia.org/wiki/Anandtech

http://en.wikipedia.org/wiki/DailyTech


By The0ne on 3/29/2010 2:03:00 PM , Rating: 2
I had to LMAO when I read the wiki on those. DT has a news and blog section eh? And this is the news section? Made my day for sure :D


By GTVic on 3/28/2010 10:52:50 PM , Rating: 2
Not a good excuse for reprinting NVidia's PR. At least then call it a press release up front.


By meepstone on 3/26/2010 7:38:26 PM , Rating: 5
Yeah, I'm sure everyone loled when they read that line. I sure did.


By xsilver on 3/26/2010 8:00:24 PM , Rating: 3
Because of their 5-10% performance gain, they'd be lucky to get 5-10% popularity - which should just kill their sales.

Unless there are still nvidia fanboys left?


RE: An Article Based on PowerPoint Facts from NVidia
By Samus on 3/26/2010 8:11:07 PM , Rating: 2
with nVidia's marketing department in overdrive, they don't even have to sell 'decent' products. every game i've played in the past TWO YEARS was an nVidia sponsored title.

mirrors edge, crysis warhead, borderlands...practically any EA title, which is almost all of them the way they gobble up everybody.

of course the nonsense for people like me is not being able to use the 'nvidia-specific' features like physx, which is bullshit because mirrors edge runs just fine with physx support for the first two levels then crashes because of an intentional memory leak.

physx should be a feature nVidia licenses to game coders, independent of what card you use. any GPU should be able to run CUDA instructions.


By xsilver on 3/26/2010 8:27:58 PM , Rating: 2
I'm actually more worried about cartel-type pricing - ati has no competition now, so prices are still above RRP. With nvidia's lame effort/inability to price competitively, we could see prices above RRP for quite some time still?


By koenshaku on 3/27/2010 1:53:24 AM , Rating: 5
This thing should come with a coupon for a power supply; this card isn't elegant at all, and you can probably OC your 5870 to get the same numbers. AMD will likely release a higher-clocked 5870 with 2GB of RAM before this thing hits the streets and take its crown back before nvidia even gets a chance to prop it up on its head.


By fuzzlefizz on 3/27/2010 3:19:29 AM , Rating: 2
If I remember correctly, ATI/AMD doesn't sell cards directly anymore. It's all through 3rd parties now, like PowerColor, Sapphire, XFX, etc. An overclocked version of the 5870 would really depend on those companies.


By alanore on 3/27/2010 9:10:22 AM , Rating: 2
AMD has the final say in what speeds their partners can factory overclock a card to. Although it doesn't stop them making cards with more overclocking headroom.

http://www.anandtech.com/video/showdoc.aspx?i=3746...


By Lifted on 3/27/2010 2:58:35 PM , Rating: 4
quote:
limiting AMD’s ability to profit


You highlighted the wrong part, and you apparently think "limiting" and "has not been able to" are one and the same, which is not the case.


By GTVic on 3/27/2010 6:33:12 PM , Rating: 3
That applied to initial availability. Since then TSMC fixed some of the major issues with the process. And AMD, which had previous experience with the process and planned accordingly, is now supposedly getting respectable yields.

NVidia on the other hand did not have prior experience before the chip design was finished in mid-late 2009 and is now supposedly suffering very low yields and large power requirements from Fermi.


By Mojo the Monkey on 3/29/2010 5:25:06 PM , Rating: 2
I agree. I was hoping this would be that big next step to send prices dipping... the last time I bought was when the $300 cards hit $189 in a hurry when the 4870 vs GTX 260/280 battle heated up.

I run 1920x1200, so I need something beefy to keep up with the latest titles at full res... but I won't pay over $200 and I won't upgrade until I know I will be wowed by the change.


Misleading Article
By TechLuster on 3/26/2010 7:36:55 PM , Rating: 5
quote:
Beats ATI soundly ...

quote:
However, performance is much stronger, with the GTX 480 and GTX 470 beating the Radeon HD 5870 and 5850 respectively, according to internal benchmarks shown to us by NVIDIA.

The reviews that have gone up so far (unfortunately Anandtech is down at the moment) seem to disagree. Quoting HardOCP (which actually did their own independent testing instead of relying on NVIDIA's internal testing),
quote:
The only thing that "blew us away" was the heat coming out of the video card and the sound of the fan.

Or HotHardware:
quote:
Versus the single-GPU powered Radeon HD 5870, the GeForce GTX 480 is on average roughly 5% - 10% faster.

So not only is performance not that much higher than ATI's best single-GPU cards (at least in current games which don't make use of NVIDIA's apparent tessellation advantage), but price and especially power consumption are higher. Finally, the 5800's have been AVAILABLE for half a year, and GTX 400's STILL aren't shipping yet.




RE: Misleading Article
By fifolo on 3/26/2010 8:11:41 PM , Rating: 5
5-10% against reference ATI cards, so with any degree of overclocking that would go away.

One could then say "well, overclock the 4xx card," but given that the card is already pushed to its heat and power consumption limits, this is not a very real possibility.

I am a bit amazed at how gullible some tech sites have been, simply regurgitating nVidia's cherry-picked, unconfirmed benchmarks, where their high-end-trumps-all-next-gen card can (in certain portions of certain benchmarks with certain settings) beat their competitor's mid-range offering.

Really, I expect this kind of faithful fawning from, oh, let's say, sites that rhyme with "Bum's Hat Wear" but to post this silliness here is just, well, silly.


RE: Misleading Article
By whiskerwill on 3/26/2010 9:17:02 PM , Rating: 5
Even if it was 25% faster, I wouldn't buy it, not with those thermals. Since I got my new 32nm Core i5, I finally have a quiet system for the first time in 10 years. I'm not going to give that up for marginal video performance gains.


RE: Misleading Article
By Mitch101 on 3/27/2010 5:25:39 PM , Rating: 3
Silence and a computer room that doesn't feel like a sauna are golden.

What is with these resolutions, and every benchmark having AA enabled? I would think the vast MAJORITY of people are not running at 2560x1600 resolution. Where are the 1920x1080 benchmarks with NO AA? 1920x1200 is close, but not a mainstream monitor resolution.

Anyone else notice the NVIDIA GTX295 is faster than the GTX 480 in some instances? Could be driver maturity, but it's interesting and makes me think some component of the new architecture took a step back. Could be the reason the card isn't much faster is that some component isn't clocking up as high as it should.

Either way, it's good to see NVIDIA back with something, and ATI prices have already started dropping - I saw a 5770 1GB for $130 recently, and that's a deal. I'm using one to play Left 4 Dead 1 at 5760x1080 and drop it to 1920x1080 on more GPU-intensive games.

I believe ATI's next gen GPU is expected around October.


RE: Misleading Article
By Keeir on 3/29/2010 2:13:00 PM , Rating: 2
quote:
Anyone else notice the NVIDIA GTX295 is faster than the GTX 480 in some instances? Could be driver maturity but its interesting and makes me think some component of the new architecture took a step back.


I am usually willing to spot a company a few driver revisions... but when a product is -this- late, it's hard. I mean, are we expected to wait till summer to finally have a card that meets its potential? If so, why not wait till fall and get the AMD card, which will likely be faster, cheaper, or more efficient... or all three.


RE: Misleading Article
By Sazar on 3/29/2010 3:53:53 PM , Rating: 2
The 295 is a dual-GPU solution, so it stands to reason that it would perform pretty well in many respects. Note that the 295 actually competes with the 5970 in some benches as well.

Overall though, the GTX 480's performance is not terrible. The power, heat, noise and price will be the major issue for most folks.

I am intrigued to see what Nvidia CAN do with the card in the future on a smaller process and with all the SM's active. Given the scalability from the 470 to the 480, it could potentially be a nice performance boost with the additional SM.

Till then, I am sticking to my current card.


RE: Misleading Article
By CyborgTMT on 3/26/2010 9:59:26 PM , Rating: 4
quote:
I am a bit amazed at how gullible some tech sites have been, simply regurgitating nVidia's cherry picked unconfirmed benchmarks


What's worse is I've read a few 'reviews' that blatantly lie to make the 480/470 seem better than they really are. Not naming names here, but one site used a tweaked-out 3rd party 5870 for price comparison, then a stock ATI one for benchmarks. Multiple others used ridiculous resolutions or AA levels to amplify the difference in frame rates (though not the actual performance %). Who buys a top-end graphics card to play at 1650x1050 with no AA? OOHHH look, the 480 puts out 30 more FPS than the 5870; just disregard the fact that it's still only a 15% increase.

Guess I now know a few sites from this:
http://www.dailytech.com/Pay+to+Play+Uncovering+On...


RE: Misleading Article
By inighthawki on 3/27/2010 1:16:46 AM , Rating: 2
quote:
Who buys a top end graphics card to play at 1650x1050 with no AA?

I do, if you want an example of someone who does. I enjoy fast performance and decent quality but quite honestly at 1680x1050 and higher, aliasing becomes more and more unnoticeable.

I prefer better performance over a quality increase that I RARELY notice when I'm really immersed in a game...


RE: Misleading Article
By CyborgTMT on 3/27/2010 1:46:22 AM , Rating: 2
If you can notice a difference between 160 fps and 130 fps in a game, you, sir, have got skills!


RE: Misleading Article
By inighthawki on 3/27/2010 1:51:40 AM , Rating: 5
Since when do the latest games at 1680x1050 on max detail ever get 160fps? Not to mention benchmarks show averages, not minimums, which means that your framerate can in fact dip down low at certain points. That sudden stutter is more than enough to ruin the game for me, and I would choose no stutter over AA.


RE: Misleading Article
By CyborgTMT on 3/27/10, Rating: 0
RE: Misleading Article
By inighthawki on 3/27/2010 12:29:19 PM , Rating: 2
I guess what I'm trying to say is that I would much prefer guaranteeing a higher framerate, which leaves less possibility of any stutter (considering I don't have a top-of-the-line CPU, my framerates are not always as high as those in the benchmarks), as opposed to a graphical setting that I likely won't see any difference from.

I mean, how often do you REALLY notice aliasing at those resolutions? This isn't something like dynamic lighting, shadows, or texture quality which can show huge and noticeable differences...


RE: Misleading Article
By Calin on 3/29/2010 9:30:01 AM , Rating: 3
You won't notice a difference between 160 and 130 fps, but you'll notice a "minimum frame rate" difference between 45 and 25 fps (and in some cases, the minimum frame rate was better on the overall slower card).


RE: Misleading Article
By damianrobertjones on 3/27/2010 1:45:45 PM , Rating: 3
I play at 1650x1080 as that's the max and default of my monitor. Not everyone has a great big res monitor!!


RE: Misleading Article
By CyborgTMT on 3/27/2010 6:04:22 PM , Rating: 3
It's not playing at 16x10 that I was upset about; it's the review sites that artificially lowered the settings to create a larger discrepancy between frame rates. I said in another comment that one of my systems is on a 1680x1050 monitor. I would be willing to bet that the majority of gamers are also on 20-22" monitors with the same rez. But if you are going to do a review, don't put a game at the lowest settings at a resolution that these cards can easily max out. The main site where this bugged me did a 'quality' review of both at the 3 main flat-screen resolutions, running the games in low, medium and high settings, but then in the comments/summary section stated that (paraphrasing) the 480 was the far better choice because it outperforms the 5870 at times by over 30 fps. Yeah, 30 fps at the lowest settings on the lowest resolution. They fail to point out in the same summary that with game quality settings on high the difference is 5-10 fps. Technically you can say the information is in the review, but it's not what they cherry-pick to put in the headline or in the summary sections.

I guess in the end yesterday I was just very frustrated at not finding a balanced review of the cards so I could make an informed buying decision for my summer build.

BTW, I also love the sites using outdated drivers (some as far back as last year) for the 5870, then defending their choice the next day as their forums fill up with calls of BS.


RE: Misleading Article
By inighthawki on 3/27/2010 10:07:25 PM , Rating: 2
I see your point; I noticed this too. The 480 doesn't scale as nicely as the 5870 at higher resolutions. Yes, it is still better, but the difference between the two cards begins to disappear. The true test, however, is maxed-out settings (AA optional) at the most commonly used resolution. That is the scenario they should base their conclusion on. Anyone who uses a different resolution can then easily skim through the charts to see how it compares.


RE: Misleading Article
By The0ne on 3/29/2010 2:06:22 PM , Rating: 2
You don't seem to be too aware of the relationship DT and AT have with Nvidia. I would recommend reading up on current and past comments/reviews to see how they take Nvidia's "suggestions." :)


RE: Misleading Article
By AnnihilatorX on 3/27/2010 9:01:27 AM , Rating: 2
quote:
The only thing that "blew us away" was the heat coming out of the video card and the sound of the fan.
-- HardOCP

quote:
Because designing GPUs this big is "fucking hard"
-- NVIDIA's VP of Product Marketing Ujesh Desai on GF100

As soon as I read that quote in Anandtech's benchmark prologue, it made my day.


Misleading!
By kroker on 3/26/2010 8:18:50 PM , Rating: 5
Wow, the performance slide is either incredibly misleading or downright lying.

Example: According to Anandtech, performance in Dirt 2 is:
1920 x 1200, 4X AA, DX11: HD5870 - 71 fps, GTX 480 - 87.3 fps (22% faster than the Radeon)
2560 x 1600, 4X AA, DX11: HD5870 - 52.1fps, GTX 480 - 56.1 fps (7.7% faster than the Radeon)

You can see the results here: http://www.anandtech.com/video/showdoc.aspx?i=3783...

According to the above slide, GTX 480 is about 60% faster at 1920 x 1200 with 4X AA, and about 50% faster at 2560 x 1600 with 4X AA. So... are they lying?

But then I went to the Hexus.net review, and they tested Dirt 2 both with DX 11 support and with DX 9 support. The performance in DX 9 is much higher than in DX 11! (http://www.hexus.net/content/item.php?item=24000&p... ). In fact, the performance is pretty much in line with the above slide.

So, Nvidia seems to be using DX 9 scores on a slide called "Unmatched DX 11 performance"!!! Wow, these guys are just too much!

One other thing: Nvidia claims 250W TDP, which should be 62W more than HD 5870. But according to Anandtech, in Crysis, GTX 480 consumes 102W more power than HD 5870 (http://www.anandtech.com/video/showdoc.aspx?i=3783... ), for just 10-15% more performance.
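
Those percentages are easy to reproduce (a small sketch; the fps and wattage figures are the ones quoted above):

```python
# Reproducing the relative-performance and TDP figures quoted above.
print(87.3 / 71.0 - 1)   # ~0.23 -> the "22% faster" figure at 1920x1200
print(56.1 / 52.1 - 1)   # ~0.077 -> the "7.7% faster" figure at 2560x1600
print(250 - 188)         # 62 -> the claimed TDP gap in watts vs. the HD 5870
```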




RE: Misleading!
By kroker on 3/26/2010 8:30:22 PM , Rating: 2
Sorry, I made a little mistake. The slide is relative to the performance of the HD 5850, so it only indicates that Dirt 2 runs about 40-45% faster than the HD 5870 at 1920 x 1200 and about 30% faster at 2560 x 1600. Still, these figures are completely consistent with the DX 9 results from Hexus: ~46% and ~27%, respectively.


RE: Misleading!
By B3an on 3/27/10, Rating: -1
RE: Misleading!
By yomamafor1 on 3/29/2010 8:28:34 AM , Rating: 4
You mean things that Radeons are also capable of (especially OpenCL), but are only held back by software?

The Radeon 4870 GPU has a computing power of 1.7 Tflops, while the GTX280 can't even hit 1 Tflop. The newest 5870 has 2.72 Tflops (the GTX480's figure, on the other hand, is withheld by Nvidia). It really seems the Radeon has a lot of untapped power.


RE: Misleading!
By dubldwn on 3/26/2010 8:38:38 PM , Rating: 3
quote:
According to the above slide, GTX 480 is about 60% faster at 1920 x 1200 with 4X AA, and about 50% faster at 2560 x 1600 with 4X AA. So... are they lying?

No, they're not. Well, maybe, but not in the way you're describing it. They cut off the graph at 0.80 using the 5850 as a reference for 1.00. So, you have to imagine the graph going all the way down to 0.00. It's misleading, just like the graphs ATI releases.
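
A small sketch of the distortion being described; the bar heights below are hypothetical, chosen only to show the effect of starting the axis at 0.80 instead of 0:

```python
# How a truncated axis exaggerates a gap (hypothetical bar values).
baseline, bar = 1.10, 1.60   # e.g. HD 5870 vs. GTX 480, relative to HD 5850 = 1.00
floor = 0.80                 # the slide's y-axis starts here, not at zero
true_gap = bar / baseline - 1                        # ~45% actual difference
drawn_gap = (bar - floor) / (baseline - floor) - 1   # ~167% difference in bar height
print(f"actual: {true_gap:.0%}, as drawn: {drawn_gap:.0%}")
```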


RE: Misleading!
By kroker on 3/26/2010 8:51:07 PM , Rating: 2
I know the bars start at 0.8, I wasn't talking about that.

I did make a small error, as stated above. In Dirt 2, the GTX 480 performance bar indicates about 1.6 @ 1920x1200, and HD 5870 indicates about 1.1 - 1.15, so about 40-45% difference between them. And @ 2560 x 1600, the bars indicate ~1.5 for GTX 480, and ~1.2 for HD 5870 (funny that it seems to be a little faster at this resolution), so the difference between them is ~25-30%.

At least for Dirt 2, Nvidia is using DX 9 scores in slide which is supposed to indicate DX 11 scores!


RE: Misleading!
By Calin on 3/29/2010 9:35:19 AM , Rating: 2
I think those scores were taken with an older version of the demo (which didn't run in DX11 mode on the NVidia cards).


RE: Misleading!
By kroker on 3/29/2010 7:22:44 PM , Rating: 2
Even if that's the case, it doesn't explain why they show those scores in a slide which claims to indicate DX 11 scores. My point still stands - it's lying and misleading.


RE: Misleading!
By JumpingJack on 3/28/2010 11:31:37 PM , Rating: 3
I always find it utterly despicable when companies start the scale at an arbitrary non-zero point to yank what is really a small difference into an illusion of a large difference.

nVidia is horridly guilty of this, ATi does it too... heck, even AMD (http://www.firingsquad.com/hardware/amd_phenom_2_9... ). Looking for an Intel slide that does it; haven't found one yet, but I am sure it's happened.

Just disgusting.


RE: Misleading!
By The0ne on 3/29/2010 2:13:05 PM , Rating: 2
This is the case when Sales/Marketing and sometimes bosses get their hands on the "actual" results. Hell, my previous boss requested I do MTBF calculations differently so the numbers would come out nice and good when he presented them to the execs and ultimately to investors and consumers.

This is why, with technology, you should perform the tests yourself if you can, to see what really is going on. If you can't, find a VERY reliable source to obtain your information from.


NVIDIA will lose money
By Phoque on 3/26/2010 10:04:10 PM , Rating: 4
This generation around, NVIDIA will lose big money and also, I guess, some market share. A 60% bigger chip for 5-10% better performance is a complete disaster from a gamer-product point of view (though maybe for GPGPU it's an awesome product that will sell well and at big profit in Quadro and other specialized markets).

This Fermi crap is doomed. The next generation though, it should be The Way It was Meant To Be Designed and have success.




RE: NVIDIA will lose money
By Shig on 3/26/10, Rating: -1
RE: NVIDIA will lose money
By Murst on 3/26/2010 11:51:07 PM , Rating: 2
It might not be so rosy for Nvidia in this market segment in the near future.

At this point, sure, nothing comes close, but as both AMD and Intel start integrating GPU-like cores into their standard CPU lines, the market will probably shift. Sure, on a per-chip basis, Tesla will outperform them, but the price/performance difference may not be so clear.

Also, if this market is so huge, why didn't Cell ever take off? My guess is that the demand isn't as high as you believe it is.


RE: NVIDIA will lose money
By fuzzlefizz on 3/27/10, Rating: 0
RE: NVIDIA will lose money
By Kim Leo on 3/27/2010 12:55:29 PM , Rating: 3
quote:
Intel's Larrabee project was cancelled due to delays and disappointing performance figures. I would assume AMD is on the same boat.


Wait, how does Intel trying to make an x86-based GPU have anything to do with AMD using the technology it acquired by merging with ATI? And last I checked, the raw processing power of HD58XX cards is high enough for them to follow nVidia into the HPC market.


RE: NVIDIA will lose money
By fuzzlefizz on 3/27/2010 4:28:59 PM , Rating: 1
My statement has nothing to do with raw processing power of ati or nvidia's card. Just a statement of the current situation of the GPGPU market.

Intel's Larrabee has nothing to do with AMD's merger with ATI. Larrabee was supposed to be Intel's competitive offering to what AMD/ATI and Nvidia has, and possibly competition to IBM's cell mainframe.

Doing some digging, AMD's Fusion is still around and is the same idea. However, Intel's Larrabee and AMD's Fusion are just their responses to IBM's and nVidia's HPC offerings. Everyone's trying to compete one way or another.

Of course, ATI also has FireStream: one product competing with nVidia, and Fusion as a response to Intel's Larrabee.


RE: NVIDIA will lose money
By Penti on 3/27/2010 9:34:43 PM , Rating: 4
AMD Fusion (implementations) has nothing to do with discrete graphics and nothing to do with the HPC market. Fusion is just mainstream/mobile stuff, just as Larrabee has nothing to do with X4500 integrated graphics. Moving the GPU onto the CPU doesn't give you any more GPGPU power.


Catalyst 10.2!
By Phoque on 3/26/2010 9:53:13 PM , Rating: 4
The slide mentions catalyst driver 10.2. WHQL driver catalyst 10.3 improves performance anywhere from 2% to 20%!

http://www.tomshardware.com/news/gpu-catalyst-driv...




RE: Catalyst 10.2!
By Phoque on 3/26/2010 10:06:14 PM , Rating: 2
Got to give it to Nvidia, they didn't use their most recent driver either, and it substantially improves their performance too.


RE: Catalyst 10.2!
By Goty on 3/27/2010 3:27:30 AM , Rating: 2
Ah, so you're the only person outside of NVIDIA to have access to multiple driver versions that support this card, eh?


Is this correct?
By Murst on 3/26/2010 10:42:50 PM , Rating: 2
From the article:
quote:
but users who wish to use more than two monitors at the same time will be required to use a second card

According to Anandtech:
quote:
Shifting gears to the consumer side, back in January NVIDIA was showing off their Eyefinity-like solutions 3DVision Surround and NVIDIA Surround on the CES showfloor. At the time we were told that the feature would launch with what is now the GTX 400 series, but as with everything else related to Fermi, it’s late.

Neither 3DVision Surround nor NVIDIA surround are available in the drivers sampled to us for this review. NVIDIA tells us that these features will be available in their release 256 drivers due in April. There hasn’t been any guidance on when in April these drivers will be released, so at this point it’s anyone’s guess whether they’ll arrive in time for the GTX 400 series retail launch.

Perhaps I'm not understanding the Anandtech quote properly, but I always thought that Eyefinity was the 3 monitor support, and after reading that in the Anandtech review, I assumed that Nvidia will in fact support 3 monitors on a single card when it does come out.




RE: Is this correct?
By Goty on 3/27/2010 3:29:50 AM , Rating: 2
The anandtech quote doesn't state anything about the number of cards needed.


RE: Is this correct?
By Jansen (blog) on 3/27/2010 9:17:02 AM , Rating: 4
Anandtech's Ryan Smith:

"Bear in mind that GF100 doesn’t have the ability to drive 3 displays with a single card, so while there are 3 DVI-type outputs here, you can only have two at once."


website issue..
By CyborgTMT on 3/27/2010 6:11:04 PM , Rating: 4
quote:
This website has been reported as unsafe
www.dailytech.com

We recommend that you do not continue to this website.
Go to my home page instead

This website has been reported to Microsoft for containing threats to your computer that might reveal personal or financial information.

More information

This website has been reported to contain the following threats:

Malicious software threat: This site contains links to viruses or other software programs that can reveal personal information stored or typed on your computer to malicious persons.

Learn more about phishing
Learn more about malicious software
Report that this site does not contain threats
Disregard and continue (not recommended)



I keep randomly getting this from DailyTech and occasionally on AnandTech. I keep reporting the site as safe, but it might be an advertisement packing in something extra....




RE: website issue..
By hadifa on 3/27/2010 9:57:25 PM , Rating: 2
I get the unsafe message in both Firefox and Chrome every time I try to click on a link in Anandtech.


RE: website issue..
By leexgx on 3/29/2010 12:31:09 PM , Rating: 2
Seems more like DNS malware or just malware on the PC (a rootkit virus that hides system drivers can be the problem as well).

Run Malwarebytes and Spybot if you can, or as a last resort ComboFix (say no to the console install thing when it asks).


Performance per watt
By Kepe on 3/27/2010 9:33:10 AM , Rating: 5
I sent Anandtech an email, here it is:

Hi!

Thanks for the very insightful review of the new product from nVidia.
As the GTX 480 is a very power hungry card, I did some performance
per watt calculations to see how it compares to the ATi Radeon HD
5870. I think you might want to do these calculations by yourself
and add them to the review, because in my humble opinion the
results are very interesting.

I took the power draw measured in Crysis and compared it to the
FPS the cards were capable of. Here are some results:

Single gpu: (higher is better)

GTX 480: 0.067 fps/W
GTX 470: 0.062
HD 5870: 0.081
HD 5850: 0.082

The HD 5870 has 21% better fps per watt performance than the
GTX 480, which is a pretty big margin if you ask me.

Dual gpu:

GTX 480 SLI: 0.083 fps/W
HD 5870 CF: 0.089
HD 5970: 0.098

Dual gpu results look a lot better because the power consumption
is measured on the whole system level. Adding a second gpu
nearly doubles the performance, but only adds the power draw caused
by the second gpu / graphics card to the power draw of the system.

Keep up the good work at Anandtech!
You're my favorite IT site.

- Aleksi
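
For anyone wanting to redo the math, a minimal sketch of the metric (the fps and wattage inputs would come from a review's own measurements; only ratios already quoted in the email are reproduced here):

```python
# The email's metric: average fps divided by total system power draw.
def fps_per_watt(avg_fps: float, system_watts: float) -> float:
    return avg_fps / system_watts

# The 21% margin cited above checks out from the email's own numbers:
print(0.081 / 0.067 - 1)   # ~0.21 -> HD 5870 vs. GTX 480
```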




RE: Performance per watt
By Taft12 on 3/29/2010 1:19:00 PM , Rating: 2
Congrats on being rated up to a 5, but performance per watt is just about completely meaningless if we are talking about gaming performance (which you are).

For CPU benchmarks, it makes sense to measure how much electricity is used to complete a discrete task, such as converting a video or rendering a scene, but gaming is not a discrete task.

NVIDIA's performance is indeed dreadful when you factor power consumption, but I really don't see much value in the calculations you have done. A low-end video card might achieve a very high FPS/W, but it would not matter if it couldn't provide a playable frame rate.


SLI - Is that even possible
By BZDTemp on 3/27/2010 1:42:12 PM , Rating: 2
I mean has anyone seen a site that has two cards so they could test SLI and debunk/confirm the Nvidia scaling claims?

Also, am I the only one finding it funny how Nvidia is all about single-GPU performance, as if it somehow mattered to the end user how many chips are on board? Nvidia must be really desperate.




RE: SLI - Is that even possible
By arjunp2085 on 3/27/2010 2:51:43 PM , Rating: 1
Man that is one HOT Card..... Toooo HOT to Handle...

Nvidia may be driving more PSU sales than GPU sales!!!!

The way it is meant to be HEATED... :)

By the way, is this card in spec with PCI-e standards under LOAD??




By AnnihilatorX on 3/27/2010 4:01:42 PM , Rating: 2
Anandtech had SLI benchmarks.


Apples to Oranges
By stromgald30 on 3/26/2010 7:25:04 PM , Rating: 3
nVidia's comparison of the GTX480 vs. the 5870 is a bit flawed. They're assuming that ATI won't make any price drops when Fermi hits shelves, which is a bit presumptuous considering the price inflation on ATI cards due to lack of competition.

IMO, the 5970 will probably drop to the GTX480's level and the GTX470 will have to compete with the 5870, against which it only has a slight advantage.

The drop in prices would also put the power consumption numbers in perspective since the 5870 is spec'd at 188W, which is comparable to the 215W of the GTX 470, but not the 250W of the GTX 480.




RE: Apples to Oranges
By leexgx on 3/29/2010 12:37:50 PM , Rating: 2
Correction there, as you seem to be looking at it backwards.

The GTX470 needs to match the 5850's price or be only a little bit higher.

The GTX480 should be priced just a little bit higher than the 5870.

The 5970 is normally faster than a single GTX480 all the time, so the GTX480 should be priced lower than a 5970.


Hot!
By Kyanzes on 3/28/2010 4:25:18 AM , Rating: 2
Where no man has boiled an egg before!




RE: Hot!
By Kaleid on 3/30/2010 7:03:16 PM , Rating: 2
Presentiiiiing Nvidia Thermi

Seriously though, when companies start to say things like "it's really fast at this sort of calculation [tessellation]" and that it's not just a GPU, the alarm bells should be ringing.

It's not the best car, but the seating is unbeatable!


ATTACK of the Fan Boys!!! LOL
By soda777 on 4/24/2010 10:25:04 PM , Rating: 2
When reading through comments I'm finding a majority to be insecure ATI fanboys with their panties in a twist over the new cards, lol. No offense to the fair people, but where are the REAL discussions about the new cards? And by real, I mean discussions not directed by agendas for one company or another. I get it. There's no real advantage for someone owning a recent ati card to buy these cards because they aren't that much better. But for someone just shopping, the picture could be different. So far it seems 90% of everything said are gross exaggerations based on fears or speculation of what might happen. Or because someone who owns an ati card is reassuring themselves that their card is still good. And that's just worthless. I've even seen a video showing a nuclear holocaust to depict what would happen if these new cards were used. Funny, but useless.

And many argue it will cost more in electricity, but without noticing that even if you ran it 4 hours a day, 365 days a year, meaning you had no life, it wouldn't cost more than an average of $8 a year more to run. Yet these same money "misers" complain that you can't run 6 monitors off the same card, as if most are going to buy six. They also complain about the heat even though they are not quite sure just how much more heat, as they quote only the highest, cherry-picked and speculated worst-case scenarios.
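
That electricity figure is easy to ballpark (a rough sketch assuming the 62 W TDP gap between the GTX 480's 250 W and the HD 5870's 188 W cited elsewhere on this page, and an electricity rate of $0.09/kWh, which is an assumption, not a figure from the article):

```python
# Rough annual running-cost delta for the 4-hours-a-day scenario above.
extra_watts = 250 - 188              # GTX 480 TDP minus HD 5870 TDP
hours_per_year = 4 * 365
kwh = extra_watts * hours_per_year / 1000
print(f"{kwh:.0f} kWh -> ${kwh * 0.09:.2f}/year at $0.09/kWh")   # ~91 kWh -> ~$8.15
```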

There isn't a problem with saying the nvidia isn't a huge boost over the ati, or that it's a bit warmer. It's true. The problem I see is fans going out of their way to reiterate only the most negative viewpoints, to imply the card isn't good for anyone and no benefits could be possible. It's not as if they've shown up to determine if it's a reasonable solution for gaming, but rather to speculate that it's a failure. They complain that there isn't enough official information to say the card is faster and "laugh" at that speculation, but at the same time feel free to blast the card with nothing more than speculation. And they don't want to allow any positive speculation for nvidia, but they feel free to take all negative speculation to heart and believe it all. THAT is what insecure fanboys do, and I've seen the fanboy game thousands of times before, so I know.

But ever wonder what would have happened if ATI had released a card similar to the GTX 6 months ago? With no competition from nvidia, fans would have given it a shot and it probably would have worked just fine in their gaming rigs. They'd realize it was a much more powerful card and it would all make sense. Then after 6 months of anticipation, Nvidia would release a card like ati's and it would be a SLOWER card! Even though it was the same as Radeon's latest cards, it would probably be the JOKE of the internet. All the ati fans would talk about how nvidia dropped the ball and couldn't create something as fast as the ati. And they wouldn't care about a little more warmth because they see it working in their rigs. Then the nvidia fans would show up and say, "But wait! The nvidia uses less power and is a little cooler!!" And the ATI fans would NOT be impressed. They'd call them retarded, green tree-huggers with a lame excuse from nvidia. And say, "No thanks, but I don't want a SNAIL for a card. I'll keep my 5870".

The problem is there are few articles that attempt to see if the 470 is reasonable enough or not, and a few even seem to be written by ati fans, or by those jaded by the fact that they already have a 5870, etc. Either it works in a reasonable way or it doesn't. And there are also reviews that are simply too nice to nvidia but don't show much of anything. I want to know if this card is reasonable when placed in a typical gaming rig, NOT a bunch of speculation or fear mongering. If it's only a little warmer and functions properly in a nice case like an Antec 900, then it could be a good card. And based on reviews from people who got some of the first ones, they've been saying the "problems" they were warned about are overrated. Again, where are the REAL, unbiased reviews? And not just nvidia graphs!! Just put one in a CLOSED gaming rig and show a game running and what it's like, without special efforts to make it look bad. Again, an unbiased, fair review.




RE: ATTACK of the Fan Boys!!! LOL
By soda777 on 4/24/2010 10:40:08 PM , Rating: 2
I'll also add that I know someone will likely quote some huge temp extreme they read about. And I've seen a couple of high numbers, but I've also seen a few random reports saying the temp extremes have been overrated and were lower. So, again, that's why I want more real reviews, running real games, with fair ambients, showing some real temps. The more the better. And so far they've been very few and far between...


HDMI - Almost useless?
By mavricxx on 3/28/2010 6:44:36 PM , Rating: 2
"Instead, the reference design has two Dual-link DVI ports and an almost useless mini-HDMI port."

What are you talking about? HDMI is WAY more popular than DisplayPort! I have yet to see a DisplayPort monitor (or a worthy one at that). But I do wish they had done like ATI did with its 5870 cards, which had HDMI, DisplayPort and dual DVI connectors.




system review
By hlper on 3/29/2010 10:04:34 AM , Rating: 2
I actually noticed this on CNET late last week. This system has the new 480 card in it, and the one thing the reviewer mentioned was that the heat seemed high enough to limit hardware lifespan, and the card was too hot to touch after only brief gaming.

It's not like the ATI cards are cool to the touch, but I would be a little concerned building my own system with these NVidia cards.

http://reviews.cnet.com/desktops/velocity-micro-ed...




By jpr703 on 3/30/2010 12:53:46 PM , Rating: 2
After losing an expensive Dell laptop due to heat issues with an Nvidia 8600M GT, I won't even be looking at another Nvidia card. Companies that knowingly sell junk just aren't worth considering, no matter how tempting their current line of BS is.




Not a bad showing
By Parhel on 3/26/10, Rating: -1
RE: Not a bad showing
By kroker on 3/26/2010 10:04:30 PM , Rating: 5
They are both duds. Power consumption is through the roof, considering they use 40nm chips. Actually, GTX 470 seems a little more reasonable than GTX 480.

quote:
I still think the 5870 is the better buy overall, and frankly I wouldn't buy a card as hot / power hungry / loud as the GTX480, but this is nowhere near the total disaster I was expecting.


The HD 5870 is 10% slower on average, 20% cheaper, consumes 102W less power at load, is 17C cooler and 5dB quieter, and it's AVAILABLE. Is there really any objective reason to buy one of the new Nvidia cards, except maybe CUDA/OpenCL for those who need it? The GTX 480 is exactly the disaster I was expecting, which is bad news for all of us except maybe Charlie Demerjian and ATI fanboys.


RE: Not a bad showing
By fuzzlefizz on 3/27/10, Rating: 0
RE: Not a bad showing
By dustwalker13 on 3/27/2010 10:07:09 AM , Rating: 4
umm ... the 5970 is faster than the 480. if anything you would have to buy two 480s to top your current performance, and then you are looking down the road of spending around 1k for your graphics solution ...

mind you, you probably could cut down on your expenses for heating your apartment/house in winter; that would soundly be handled by your pc then :P


RE: Not a bad showing
By Penti on 3/27/2010 9:25:44 PM , Rating: 3
Plus they could probably release an HD5890 card that would match the GTX480, if they want to. Maybe a 2GB HD5870 will be enough. The HD4890 was like ten percent faster than the HD4870...


RE: Not a bad showing
By Orac4prez on 3/28/2010 6:18:38 PM , Rating: 2
How can you run CUDA on these cards? I run some BOINC projects and a quantum mechanics program which use CUDA. The calcs on a quad core take more than 24 hours now and the existing Nvidia cards crash when the weather is hot. I would need a dedicated water cooling system just for the graphics cards to keep them running! Unless the heat issue is fixed very soon they will kill off this "advantage".


RE: Not a bad showing
By Parhel on 3/29/2010 2:53:38 PM , Rating: 2
quote:
GTX 480 is exactly the disaster I was expecting


Maybe my expectations were just lower than yours. I wasn't sure they would even be able to release a product this cycle. Really, I'm still not sure, considering no one knows what availability will look like.

If it actually hits stores, I don't see the 480 as a total failure. It would probably be more appropriately priced at $399, though, considering AMD's lineup. To paraphrase a recent Anandtech article, there are no bad video cards, just badly priced video cards. Then, against the 5870, the tradeoff would be increased heat, noise and power draw for a 10% or so performance increase and PhysX.

Personally, I try to buy at the high end and keep my cards for around three years, so I don't think I'd trust a card with those temps for that long, nor would I be able to deal with the noise. I'd opt for the 5870 almost regardless of price.

But I still think this architecture might work at 28nm. They would have to execute on all fronts to be sure, but if they could increase yields, increase clocks, enable the full 512 SPs, fix whatever it is that's holding back memory clocks, and reduce the power requirements, it might be a winner next year. It's a big list, but I'm still hopeful.


RE: Not a bad showing
By FaceMaster on 3/27/10, Rating: -1
RE: Not a bad showing
By AnnihilatorX on 3/27/2010 3:42:09 PM , Rating: 5
Old jokes aren't funny anymore. In Soviet Russia, Crysis runs you


RE: Not a bad showing
By FaceMaster on 4/1/2010 2:01:02 PM , Rating: 2
quote:
Old jokes aren't funny anymore.


Your Mum's still funny


"We basically took a look at this situation and said, this is bullshit." -- Newegg Chief Legal Officer Lee Cheng's take on patent troll Soverain













