



NVIDIA's Tom Petersen "raises the roof" for NVIDIA's new graphics card.  (Source: YouTube/NVIDIA)

The new cards will adopt a vapor chamber design, which will offer quieter, cooler operation.  (Source: YouTube/NVIDIA)

With the cards, NVIDIA will also be pushing multi-layer tessellation as the next big thing in gaming.  (Source: YouTube/NVIDIA)
It's not easy being green, but NVIDIA is preparing its counter-punch to AMD's 6000 series

The second chapter of the DirectX 11 wars may soon be written. 

AMD is in the process of refreshing its 40 nm Evergreen GPUs with a new family of GPUs dubbed "Northern Islands".  The first Northern Islands hardware -- the Radeon HD 6850 and Radeon HD 6870 -- has launched, belonging to the budget-friendly Barts subfamily.  A dizzying array of other product subfamilies is also reportedly incoming -- Antilles, Caicos, Cayman, Turks, Blackcomb, Seymour, and Whistler.  The one to watch most closely is probably the Antilles series, AMD's high-performance line.  AMD's aims for single-GPU supremacy rest on the Radeon HD 6990, an Antilles card set to launch before the end of the year.

NVIDIA was late out of the gate during the last round, and its numerous 400 series delays ultimately cost it the lead in the discrete graphics market.

This time around NVIDIA hopes to counter AMD much more quickly, as it is reportedly preparing to release the GeForce 500 series, its own 40 nm refresh of the GeForce 400 series.

A couple of weeks ago NVIDIA briefly posted the name of what will presumably be one of the first discrete GPUs in the lineup -- the GeForce GTX 580.  This week NVIDIA was busy (officially) showing off the new GPU running a nifty multi-layer tessellation demo and the new Call of Duty: Black Ops.  It was also revealed during the demonstration that the card has a vapor cooling shroud.

NVIDIA says the new vapor shroud cuts noise levels by 7 decibels and allows the system to run cooler.  The shroud operates similarly to the shrouds Sapphire has used for some time on its AMD Radeon GPUs -- it's basically a sealed liquid cooling system.  A coolant liquid in the shroud circulates over the hot GPU, picking up its thermal energy.  The vaporized liquid travels to the fan, cools off, condenses, and is then recirculated, completing the circle of (cooling shroud) life.
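To put rough numbers on those two claims -- the 7 decibel drop and a sealed loop moving GPU-class heat -- here is a minimal back-of-the-envelope sketch in Python. The working fluid and flow rate are illustrative assumptions, since NVIDIA has not disclosed either.

# A 7 dB reduction corresponds to roughly a 5x drop in sound power,
# because decibels are logarithmic: ratio = 10^(dB / 10).
noise_drop_db = 7.0
sound_power_ratio = 10 ** (noise_drop_db / 10)    # ~5.0

# Heat moved by an evaporation/condensation loop:
#   Q = mass flow * latent heat of vaporization
# Water stands in for the undisclosed coolant; the flow rate is assumed.
latent_heat_j_per_kg = 2.26e6    # J/kg for water
mass_flow_kg_per_s = 1.1e-4      # assumed; about 0.4 kg cycled per hour
heat_moved_w = mass_flow_kg_per_s * latent_heat_j_per_kg

print(f"Sound power cut by a factor of {sound_power_ratio:.1f}")
print(f"Heat transported: {heat_moved_w:.0f} W")   # ~250 W, a GTX 480-class load

Even a tiny assumed flow moves on the order of a high-end card's entire heat output, which is why a chamber only millimeters thick can outperform a solid copper baseplate.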

With the new shroud NVIDIA hopes to end its noise woes, returning to GeForce 200 series levels.  Of course that improvement will likely come at a cost to NVIDIA's bottom line, as vapor shrouds certainly command a premium over traditional coolers.  NVIDIA clearly is still struggling with heating issues, so rather than utilize a noisy, high flow fan like the 400 series, this time around it is opting to pay a bit more for a nicer solution.  Will that be reflected in the price?  We shall see.

When it comes to performance, though, NVIDIA is unequivocal in its belief that it will reign supreme.  At the teaser event, the company's Director of Technical Marketing, Tom Petersen, brags, "This is the fastest DirectX 11 GPU on the planet."

If those statements are accurate, and NVIDIA is able to launch its new models quickly, it stands to regain much ground on AMD, assuming competitive pricing.  Of course NVIDIA was optimistic about launching the 400 series in 2009, but reality eventually became an April 2010 launch, so don't count your GPUs before they hatch.

NVIDIA is clearly feeling the heat from AMD, as it is slashing the prices on its mid-range GPUs.  The company writes:
As you are likely writing about the upcoming Radeon 6800 series launch, we felt it was important that you're up to date on the latest GeForce GTX 400 series pricing.
We'd like to inform you of new suggested retail pricing (SEP) for one of our most popular GPUs, the GeForce GTX 460 1GB. The new SEP for the GTX 460 1GB is $199.99. As always, the SEP is just a suggestion and you'll likely find retail boards from our partners at multiple price points. We expect many standard boards to sell in the $180s-$190s, and OC boards to sell for $209+.

In addition to the GeForce GTX 460 1GB, we'd also like to mention new pricing on the GeForce GTX 470. The SEP of this GPU is now $259.99. The GeForce GTX 470 offers more tessellation engines than GTX 460, making it ready for the most demanding DX11 games on the market today, and just as important, the games of tomorrow.
And responding to accusations from AMD that the price cuts were temporary, NVIDIA replies:
Further to our e-mail last night about the GeForce GTX 460 1GB/GTX 470 price adjustment, please rest assured that our price adjustments are in fact permanent. Any claims that our pricing update is temporary are patently false.
So are these price cuts another sign of an impending GeForce 500 launch and the official start of round two of the DirectX 11 wars?  Or are they just a desperation tactic as NVIDIA grapples with a new round of delays?  The answer should become apparent in the weeks (or months) to come, once we see when the new vapor-equipped GeForce 500 hits world markets.


Comments

and at $600 so what
By kattanna on 11/8/2010 10:49:41 AM , Rating: 4
from all i have read the new card will be selling for $600

good luck with that




RE: and at $600 so what
By MeesterNid on 11/8/2010 10:54:45 AM , Rating: 2
...and will probably be the size of a small car.


RE: and at $600 so what
By FITCamaro on 11/8/2010 12:50:02 PM , Rating: 5
No, that's the power supply you need to run it.


RE: and at $600 so what
By JakLee on 11/8/2010 7:13:20 PM , Rating: 3
This just announced - Corsair will be releasing a "special" 580GTX "SLI" edition 5800 watt power supply


RE: and at $600 so what
By EricMartello on 11/12/2010 9:31:17 PM , Rating: 2
That's the "Green tech" at it's best...now we're measuring PC power supplies in kilowatts...and most of that power goes toward playing games. heh

I really liked the 8800GT back in the day because it didn't need multiple slots, it ran cool and was relatively quiet...and for its time it was close to the top in terms of performance.

I'd like powerful GPUs that don't require a massive cooling apparatus attached to them.


RE: and at $600 so what
By majorpain on 11/8/2010 4:31:49 PM , Rating: 2
imagine the size of the PSU...


RE: and at $600 so what
By Breathless on 11/8/2010 10:55:29 AM , Rating: 3
That is common for top-of-the-food-chain cards when they first come out. They will probably make plenty of sales.


RE: and at $600 so what
By omnicronx on 11/8/2010 3:31:42 PM , Rating: 1
You seem to be missing the point: the heatsink will always cost more, whether the product is new or at the end of its life cycle.

Either Nvidia is taking the hit to its bottom line, or it's passing it on to the consumer. Either way, someone loses.


RE: and at $600 so what
By GuinnessKMF on 11/8/2010 10:56:45 AM , Rating: 2
You post as if people will actually not buy it based on price. Some people just don't blink at that kind of money when it comes to GPU performance.

Keeping the "crown" seems to be nVidia's main goal lately, and I'm sure there will be people to buy them. Sensible buyers will go for a bit more modest of a card, and benefit from these cards eventually, when the prices fall.


RE: and at $600 so what
By mcnabney on 11/8/2010 12:25:57 PM , Rating: 1
$600 will also buy an Xbox 360 and a PS3.

And PC enthusiasts wonder why the market is owned by consoles.

/PC enthusiast


RE: and at $600 so what
By B3an on 11/8/2010 1:05:32 PM , Rating: 5
You get what you pay for. Don't like it? Don't be an "enthusiast".

Around $250 will buy you a graphics card that's far more powerful than anything the consoles have.

You don't have to buy expensive cards or CPUs to enjoy better-than-console graphics.


RE: and at $600 so what
By Bubbacub on 11/8/10, Rating: 0
RE: and at $600 so what
By priusone on 11/8/2010 11:30:27 PM , Rating: 1
Yeah, because I hate playing Fallout 3 and New Vegas. A buddy of mine just built a new system, and comparing PS3/360 screenshots to his is simply amazing. The last system he built was back in 2005, and except for Crysis and a few other games, he had no complaints. His 2005 system has been hacked into a media server and a media center (took his PC and an old one that I had and moved around cards and such). He hates using a controller, as do I, but I still play with my PSP from time to time.

Sure, there are no more games being made for the Apple IIe, but as far as major companies making games for those of us who enjoy PC gaming, well, I disagree with your 'slow death' concept.

And for the record, my GPU is an HD 4670, which is about as green as a GPU can be, unless you would actually consider an Intel GMA a GPU. (Intel GMA not being a GPU = sarcasm)


RE: and at $600 so what
By glennc on 11/9/2010 5:27:58 PM , Rating: 2
Fallout 3 and New Vegas are based on a very old game engine; you will only see a resolution increase with those games. A better example would be a DirectX 11 game. Just increasing the resolution on an old engine does not improve the experience. I can play the same games on my 50" plasma, with my ass on my couch... now that improves the experience!!!


RE: and at $600 so what
By inighthawki on 11/8/2010 1:59:18 PM , Rating: 2
The problem though is that because all consoles use the same hardware, console games can be far more optimized than PC games, meaning to get the same performance on a PC, you technically need stronger hardware. But I do agree, a $600 card is still way beyond what you would need to accomplish such a task.


RE: and at $600 so what
By omnicronx on 11/8/10, Rating: 0
RE: and at $600 so what
By angryplayer on 11/8/2010 7:15:20 PM , Rating: 5
No no no... Games are being rehashed on the console to milk successes on the PC. The big "blockbuster" titles? COD? MOH? They started on the PC.


RE: and at $600 so what
By michael67 on 11/8/2010 2:11:50 PM , Rating: 2
Don't think you can play games on a console the same way as with a PC

http://tweakers.net/ext/f/Y8fXr00Gbf9JtUcD0nxGXFWi...

This is playable with one 5870, and supreme with two of them; like to see someone do that on a console ^_^


RE: and at $600 so what
By bhieb on 11/8/2010 4:26:33 PM , Rating: 2
You didn't just post WOW as an example of EXTREME gaming did you? WOW is designed for the masses, and as such, does not require much horsepower. It's made to run on laptops.

Certainly turning up the settings can stress some cards, but it is not really a good game to prove your point.

Now show me Crysis running at full settings on those 3 screens, then I'd be impressed. After all, I've never seen a post that said "But can it play WOW!!"


RE: and at $600 so what
By BZDTemp on 11/8/2010 4:29:49 PM , Rating: 2
So?

When you have a 360, a PS3 (and a Wii and a PSP) then what matters is if the same money can get you something you want/need more.

Also the $600 is cheap compared to what hardware cost just a little while ago. I remember my first 16 bit sound card was $600!


RE: and at $600 so what
By VitalyTheUnknown on 11/8/2010 11:09:02 AM , Rating: 5
Commentary on "engadget" -

I wonder if it (allegedly, GF GTX 580) will require 2 1000W PSUs, a dozen 8-pin connectors and headphones, so you don't hear the fan...
-
I suggest they use the heat from the GPU to power a small steam engine which spins a fan to cool the GPU.
-
The steam engine should be hooked up to a generator which powers the rest of the computer.
or to the UPS which keeps you safe from the inevitable blackout your LAN party is going to cause.

-
:)


RE: and at $600 so what
By Fleeb on 11/8/10, Rating: -1
RE: and at $600 so what
By Goty on 11/8/2010 12:28:19 PM , Rating: 5
Your opinion is wrong.


RE: and at $600 so what
By mcnabney on 11/8/2010 12:31:06 PM , Rating: 2
That would be Stirling.

And the waste heat of a 480, run through a Stirling, could easily power every fan in a case.


RE: and at $600 so what
By Fleeb on 11/8/2010 1:20:39 PM , Rating: 2
Sorry about that spelling. If this is a joke about the card running too hot, then I give up. It's just that there was a hardware review for a cooler that used such an engine, and the result was insignificant (I forget which one). Thus, based on that, I think it is not efficient.


RE: and at $600 so what
By Fleeb on 11/8/2010 1:28:03 PM , Rating: 2
BTW, I like your response better than the guy who just said your opinion is wrong. *Scratches head* At least yours clarifies some things for me.


RE: and at $600 so what
By nafhan on 11/8/2010 11:10:01 AM , Rating: 5
The important thing for a chip company is die size vs. performance. Die size determines your minimum price point due to manufacturing costs, and performance - obviously - determines your maximum price point. Nvidia has not been doing so well on the die size / performance ratio lately. The GTX 460 has a bigger die than the 5870, and the 6870 has a similar die size to the GTS 450.
Obviously, there's more that goes into a card than just the GPU, but I still think that's interesting to consider.
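To make the die-size economics concrete, here is a sketch using the standard dies-per-wafer approximation. The wafer cost and yield are assumptions, and the die areas are approximate figures from contemporary reports, so treat the outputs as rough relative numbers only.

import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Standard approximation: wafer area / die area, minus partial dies lost at the edge.
    return int(math.pi * (wafer_diameter_mm / 2) ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(wafer_cost_usd, wafer_diameter_mm, die_area_mm2, yield_frac):
    return wafer_cost_usd / (dies_per_wafer(wafer_diameter_mm, die_area_mm2) * yield_frac)

# Assumed: $5,000 per 300 mm wafer, 60% yield; die areas are rough 2010-era estimates.
for name, area_mm2 in [("GF104 / GTX 460", 367),
                       ("Cypress / HD 5870", 334),
                       ("Barts / HD 6870", 255)]:
    print(f"{name}: ~${cost_per_good_die(5000, 300, area_mm2, 0.60):.0f} per good die")

A bigger die pays twice: fewer candidates per wafer, and usually a lower yield fraction on top of that, which is exactly why die size sets the floor under a card's price.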


RE: and at $600 so what
By therealnickdanger on 11/8/2010 11:21:02 AM , Rating: 2
For the crowd that buys $600 cards, all that matters is the performance, no matter the die size.


RE: and at $600 so what
By Motley on 11/8/10, Rating: -1
RE: and at $600 so what
By room200 on 11/8/2010 1:00:41 PM , Rating: 5
Was that a sniff I heard as you puffed on your pipe?


RE: and at $600 so what
By kattanna on 11/8/2010 1:51:17 PM , Rating: 1
oh man, thanks for my morning chuckle

quote:
You must be new to computers, huh


if by new you mean i have only been using computers since before the hard drive was even something people thought about having at their home and a tape drive was king, then yeah, im "new" to this.

quote:
Then again perhaps if you can't afford it you might look to a less expensive hobby.


im not even going to list my computers i have at home as it would make you jealous.

But i would like to award you with the ASSumption of the day award though.

anyways, will nvidia sell some cards? sure, WHEN they actually show up. by all reports this is nothing more right now than a paper launch to steal press away from AMD and its release of new cards which are ACTUALLY shipping. performance issues and seriously screwed up naming schemes aside, AMD is clearly poised to be the new market leader for at-home GPU gaming needs.

and i say this as someone who has multiple CUDA cards in one machine because i actually use that.

and then when we look out to the next product cycle AMD is going to simply OWN. new machines will start shipping with built-in AMD GPU cores on their multi-core CPUs, which will give enough performance for the general masses. those that need/want additional performance will be able to add in a separate GPU card which will then kick in x-fire usage.

nvidia simply will not be able to withstand that, and i am not the only one seeing it. others do too, and are starting to buy appropriately.


RE: and at $600 so what
By kmmatney on 11/8/2010 4:43:14 PM , Rating: 4
Riva TNTs were never that expensive. Here is the original press release - a Riva TNT for $150

http://www.nvidia.com/object/IO_20020109_4341.html

That was too much for me - I bought a 3DFx Banshee for $80 back then. I've always been able to get great cards for around $100 until the last few years.


RE: and at $600 so what
By tviceman on 11/8/2010 2:58:39 PM , Rating: 1
Hurray for misinformation, but the actual launch price is $499. You were only $100 off. But nice try!


RE: and at $600 so what
By Assimilator87 on 11/9/2010 12:56:34 AM , Rating: 2
Two things that really bug me:
-First they completely skip the GTX 300 name and now they waste another generation of names.
-I don't understand how GF100 has so many trannies and only barely edges out the Radeons. With Cayman matching that number, nVidia's gonna get mopped!


They have a window of a couple of weeks....
By Amiga500 on 11/8/2010 11:16:44 AM , Rating: 1
To get this thing out the door, or this comment:

quote:
"This is the fastest DirectX 11 GPU on the planet."


becomes invalid.

This may even be the last time Nvidia ever holds the crown, so the fanbois should make the most of it.

[I expect Nvidia's demise to gather pace over the next 12 months. In two years, unless they have been acquired by someone else, they will be staring death in the face.]




RE: They have a window of a couple of weeks....
By Pirks on 11/8/2010 12:06:17 PM , Rating: 2
quote:
they will be staring death in the face
Only the GeForce division may die. The Tesla division will prosper. The Chinese will buy all the Teslas Nvidia makes to build their HPC clusters.


By Amiga500 on 11/9/2010 2:40:13 AM , Rating: 2
It will die too.

Who is going to use Teslas when you can use an APU, have direct access to complex instruction ops and share that memory?

Tesla is doomed.


By StevoLincolnite on 11/8/2010 12:15:03 PM , Rating: 3
quote:
[I expect Nvidia's demise to gather pace over the next 12 months.


I really hope not, we need nVidia and AMD slogging it out to keep prices down.
The 5xxx series remained fairly high priced for most of its life in comparison to the 4xxx series, which had better competition at each price point.


RE: They have a window of a couple of weeks....
By Luticus on 11/8/2010 12:17:07 PM , Rating: 3
"Fanboi" wars over graphics cards.... REALLY?? My god the dumb things people worry about.

quote:
I expect Nvidia's demise to gather pace over the next 12 months. In two years, unless they have been acquired by someone else, they will be staring death in the face.
People have been saying this for the better part of a decade... Good luck with that.

I like both AMD/ATI and Nvidia, and while lately I've leaned more toward AMD, it's simply because they have better price/performance and better cooling. That can change, and honestly, I hope it does. There aren't many real competitors in the graphics card market, so it would really suck to see one die off.

I think people need to give all this "fanboi" crap a rest... anyone who says "fanboi" sounds like a tool.


RE: They have a window of a couple of weeks....
By DEVGRU on 11/8/2010 12:59:36 PM , Rating: 4
quote:
I think people need to give all this "fanboi" crap a rest... anyone who says "fanboi" sounds like a tool.


Agreed. Its like those retards that shout "who dis, who dat, geaux Saints!" Makes me wish Katrina was followed up with a nuclear strike to clear up the trash.


RE: They have a window of a couple of weeks....
By B3an on 11/8/2010 1:16:05 PM , Rating: 1
Fanboys are the scum of the earth. The AMD/ATI ones seem to be possibly the worst, especially over the last couple of years. There seem to be more of them, and they always pop up on articles like this.

Think about it for a moment... how sad are these fanboys? They really are beyond pathetic, ridiculously immature, and must have absolutely no lives.
I actually wish they would all just die; the internet would be such a nicer place. It's hard to get useful help on many forums because of all the biased fanboy scum that floods forums and the internet.


By Luticus on 11/8/2010 1:38:16 PM , Rating: 2
I think you misunderstood my post... I'm going to come across as a bit of a prick here and I really don't want to, but it needs clarifying: what I meant was that people who use the word "fanboy" or any of its variants to describe another person are what I would consider "less than respectable". It's like when you're playing a video game and little 13-year-old kids call you a "noob" because you killed them with a skill or a rocket launcher (perfectly legit) and they don't like the way you play. Seriously, if you disagree then just disagree... do we really have to resort to name calling... or are these people in fact 13? Everyone has brand preference to some degree. People find products that they like and that work well for them, and they also like to brag about the things they own. It's human nature, and it's where all of this "fanboy" garbage originates. Two people comparing possessions and egos... sounds retarded to me.


RE: They have a window of a couple of weeks....
By mostyle on 11/9/2010 6:47:47 AM , Rating: 3
quote:
Fanboys are the scum of the earth. The AMD/ATI one seem to be possibly the worst though, especially over the last couple of years. There seems to be more of them and they always pop up on articles like this.


Sadly I agree with you, but I must say that even though the fanboys are irritating, they have their place in the market scheme of things. It is they who create the demand for the ultra-high-end parts that in the end make the mid-range items better. What I don't understand is why it's a particular brand that people are loyal to. If you've watched video card tech over the last few years, you've probably seen AMD win a round, then Nvidia, with a bit of back and forth. When I consider purchases, I go with the best product at the time regardless of brand. If you're after performance, why wouldn't you get the best-performing product regardless of manufacturer?

I guess to summarize: I can understand being a 'fanboy' of performance, but not of a brand.

-Tony


By Luticus on 11/9/2010 8:34:50 AM , Rating: 2
I think it's just human nature. Look at football; people fan up for that just the same. People like to divide themselves into teams and cheer for their side while shunning the other. Sad, but true. While I think it's a bit ridiculous (hence I don't watch football), I feel it's understandable. People like to be on the winning side, and they hate swapping sides once they've chosen.

For me, it really is all about price/performance, usability, functionality, and quality.


RE: They have a window of a couple of weeks....
By Aenslead on 11/8/2010 2:37:49 PM , Rating: 2
I'm sorry, but with all due respect - are you crazy, or just plain stupid, Forrest?

NVIDIA will die in the next 12 months? Sir, you OBVIOUSLY know nothing of the industry, the market, the economy... or computers for that mater. You are not worth the time to correct, so I'll just say: good day.


By StevoLincolnite on 11/8/2010 5:27:08 PM , Rating: 2
Worth noting that nVidia has 2-3 billion smackos sitting in a large safe somewhere, guarded by tutu-wearing exploding chickens.
Hence it's safe to say they have enough cash on hand to ride out a storm should one occur.


By Amiga500 on 11/9/2010 2:54:18 AM , Rating: 3
quote:
Sir, you OBVIOUSLY know nothing of the industry, the market, the economy... or computers for that mater.


If you are going to lecture someone about being stupid, might be worthwhile making sure your subsequent message has no spelling mistakes.

It might also be worthwhile to learn the difference between "demise to gather pace" and "die".

Anyway, leaving that aside: long term, what have Nvidia got?

x86 license? Nope. No CPU market.
Ability to integrate onto APUs? Nope, meaning:
- No lower end GPU market.
- No HPC market. Can you say "shared system memory"?
- No console market. All about the margins.

They will never be able to sustain the R&D investment required to produce high end GPUs if they can only sell to the ever decreasing (as the result of ever improving APU performance) high-end consumer market.

Their Quadro line of workstation graphics cards *may* continue to lead the way, through drivers as much as anything - but with that line having to shoulder the R&D costs (see above) its profitability will be much reduced, leading to associated reductions in R&D budgets over time.

They have nothing in their box of tricks that will be relevant to the market in 4 years.

Their stock price matters not a jot, their cash reserves matter not a jot. They don't have the license nor the technology.

It is indeed yourself who knows nothing about the industry or computers. The market and economy are irrelevant.

Nvidia need to get out of the x86 market; whether they do that through building high performance ARM CPUs or not is up to them. Right now, they are going nowhere but to the wall (or the Matrox niche equivalent).


By walk2k on 11/11/2010 2:15:39 PM , Rating: 1
Yeah if by "dying" you mean 59% marketshare compared to 33% for ATI.

LOL http://store.steampowered.com/hwsurvey/?platform=p...

Btw the 580 is selling for $470. Still too much for me but nowhere near $600.


Vapor Chamber
By Supa on 11/8/2010 12:18:22 PM , Rating: 2
The advantage of a vapor chamber is that it can transport heat a great distance in a short amount of time. But with this miniature chamber just a few millimeters thick from bottom to top, it doesn't do much besides provide a nice name for marketing purposes.

And with a vapor chamber, you add the problems of corrosion and diminishing heat-dissipation effectiveness. Gamers who replace their card within a year or two should be fine, but it might be a problem for those who want to keep using the vapor chamber card in a secondary/older computer.

---




RE: Vapor Chamber
By geddarkstorm on 11/8/2010 1:01:21 PM , Rating: 3
So, I guess this means the new GeForce is vaporware


RE: Vapor Chamber
By superstition on 11/8/2010 1:49:53 PM , Rating: 2
Sapphire's Vapor-X cooler has received high praise from reviewers.

Why?

Because it provides a quieter experience and it cools the GPU dramatically better than a standard air cooler.

That product isn't vaporware, and there's no reason to suspect Nvidia's will be either.


RE: Vapor Chamber
By Supa on 11/8/2010 2:54:24 PM , Rating: 2
Sapphire's Vapor-X cooler also uses a much bigger heatsink (compared to stock) and adds a few heat pipes. Take away those two differences, and how much performance can be attributed to the vapor chamber alone? I think very little, if any.

Of course, for marketing, "we have a better heatsink/heat pipes" just doesn't sound as nice as "vapor chamber technology".

---


RE: Vapor Chamber
By kmmatney on 11/8/2010 4:35:14 PM , Rating: 2
My experience with the Vapor-X (this was with the HD4890) was that it was still a bit too loud. I just went with the much cheaper MSI HD4890 and an Accelero S1 Rev2, and it was much quieter and cooler. It also cost less - even spending extra on the Accelero. Besides the Vapor-X cooling, the Vapor-X cards also tended to use better capacitors and circuitry, which made them nice, but expensive, cards.


RE: Vapor Chamber
By boobo on 11/8/2010 4:39:30 PM , Rating: 2
I think that was a joke... vaporware... because it has a vapor chamber in its hardware :P


two questions
By Saist on 11/8/2010 1:13:16 PM , Rating: 4
I have two questions for Nvidia, although I suspect they would not be answered even if somebody from Nvidia cared to answer them.

First: Is this new graphics card going to be profitable?

Thing is, Nvidia was having to wrap the equivalent of a $20 bill around the chips used in the GTX 200 series cards. The physical manufacturing cost of the chip, and of the card needed to support it, was heavily in excess of the cost of producing an AMD design with similar rendering throughput. With the GTX 470 it is now said that Nvidia is wrapping the equivalent of a $50 bill around each chip, and the new prices on the GTX 460 are forcing Nvidia to take a hit there too.

So, is this new chip actually going to be profitable? Will Nvidia be able to sell this chip to board vendors, and will they be able to make a video card out of that chip that is price competitive with AMD designs, without having to cut component quality or forcing Nvidia into bankruptcy?

Second: Is this chip thermally efficient?

The big question. Nvidia is fending off multiple class-action lawsuits and is suffering from vendor ire over several harmful business practices, such as failing to disclose any particular chip's real thermal properties. The GTX 480 and GTX 470 followed in the trend of Nvidia, consuming inordinate amounts of power, putting off loads of heat, and, very much like a smoke bomb, not actually doing anything in the process. Performance per Watt, a popular metric for the energy conscious, put Nvidia's first Fermi attempts at, well... um... Actually, I think the only processor that ran hotter and used more power while giving less performance was perhaps the Pentium 4 Prescott.

So, does this new chip carry on the Nvidia trend of the past however many years? Does it use far more power and put off far more heat only to slightly edge out its competitors' products on paper?

The reality is, in actual games, on any given processor, the GTX 470 wouldn't buy gamers any performance advantage over a Radeon HD 5850. The GTX 480 wouldn't buy gamers any performance advantage over a 5870. In consumer terms the GTX 470 and GTX 480 were, and are, rip-offs. Nvidia's issue is that the Halo Effect doesn't exist. The vast majority of consumers don't buy a particular brand because that brand has the fastest Ultimate Card. Intel's dominance in terms of graphics solutions sold is ample evidence of this.

AMD has increased their marketshare not on the basis of having the world's fastest graphics card that consumers can buy. AMD has increased their marketshare by having the fastest graphics cards for any given price point, being honest with vendors about their chips' thermal properties and power consumption, and by helping vendors implement product-neutral graphics solutions.

The concern with Nvidia's "new" chip is that it will follow in the original Fermi model. The concern is that Nvidia itself will follow the trends they, as a company, have set.

The bad news for everybody else is that Nvidia's corporate behavior is why the company is facing lawsuits, and possibly won't exist as a consumer graphics card vendor as early as next year.
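As an aside on the performance-per-Watt metric invoked above: it is just a benchmark score divided by board power. A trivial sketch with made-up placeholder numbers (not measurements of any real card):

def perf_per_watt(benchmark_score, board_power_w):
    # Higher is better; the score can be average FPS or any throughput figure.
    return benchmark_score / board_power_w

# Hypothetical card A is slightly faster; hypothetical card B draws far less power.
print(perf_per_watt(62.0, 250.0))   # card A: ~0.25 score/W
print(perf_per_watt(58.0, 188.0))   # card B: ~0.31 score/W -- wins on efficiency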




RE: two questions
By smookyolo on 11/8/2010 1:31:52 PM , Rating: 2
"The GTX 480 and GTX 470 followed in the trend of Nvidia, consuming in-ordinate amounts of power, putting off loads of heat, and very much like a smoke bomb, not actually doing anything in the process."

My 470 takes no more than 250W, and maxes at about 80C at maximum usage, fan at about 60% (barely audible). It also performs slightly better than my friend's 5870, and we both have similar CPUs/RAM. And the compute performance is much better than the 5870's.

You don't even cite a single reference or anything at all. Did you actually read up on anything?

A lot of people on sites such as Newegg constantly post reviews stating that their 470 gets temps of over 95C; I'd be willing to bet the case has inadequate airflow.


RE: two questions
By Donkey2008 on 11/9/2010 6:32:09 AM , Rating: 2
I will have to agree that the temps that I have seen recorded for the GTX 470 in various web reviews seem way higher than what I actually experience. With the reference cooler while gaming for hours on end, my 470 never tops 83C load or 40C idle in an Antec 900 Two case, with no side fan. For reference, my previous 4890 topped at 78C load and 60C idle.

If I end up paying a few more dollars in electricity every year because I own Nvidia, it doesn't bother me as long as the end experience is good. From my limited time with the 470, it seems that all of the crazy myths about needing a nuclear reactor to run it are exaggerated, and there is no fire spewing out of the back of my case. The heat output and noise level are about the same as all of my previous cards - X1950, 8800 GTS, 4850 and 4890.

My 2 cents.


For $600 I could buy the following
By sapiens74 on 11/8/10, Rating: 0
By therealnickdanger on 11/8/2010 11:18:16 AM , Rating: 4
And both would be much slower than the PC that a card like this should be installed in.


GPGPU
By Shig on 11/8/2010 1:24:33 PM , Rating: 2
I think it's funny that everyone on this site thinks Nvidia is all about the 'gamer'. THEY ARE NOT ANYMORE.

Their new head scientist and lead project designer is from HPC. Their new Fermi architecture is *specifically* geared towards HPC. Look at the new 2.5 PF super in China that just came online; that was a multi-million dollar order in itself. The overall heat generation from a GPU is NOTHING compared to how much heat the equivalent CPUs would put out.

'Mid-range' GPUs are quickly going to be made obsolete by CPUs like Fusion and Sandy Bridge that give equivalent graphics performance directly off the die. The '$50-100' GPU market simply won't exist in the coming years.

The short sightedness on this site...




RE: GPGPU
By StevoLincolnite on 11/8/2010 1:59:50 PM , Rating: 2
quote:
'Mid-range' GPU's are quickly going to be obsolete by CPU's like Fusion and Sandy Bridge that give equivalent graphics performance directly off the die.


No they won't, I'll tell you why.

Memory bandwidth.

Mid-range cards are moving to (or are already on) GDDR5 memory, which offers more memory bandwidth than DDR3 RAM in dual-channel mode.

Low-end cards like, say, the 5450 would be fine with DDR3 memory, as they're not powerful enough to use most of that bandwidth.
But using system memory for something like a 5670 or 5750 will not yield mid-range performance, especially at higher resolutions.
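To put numbers on that bandwidth gap: peak theoretical bandwidth is just the transfer rate times the bus width. The parts below are standard 2010 configurations, chosen for illustration.

def bandwidth_gb_s(mega_transfers_per_s, bus_width_bits):
    # Peak theoretical bandwidth = transfers per second * bytes per transfer.
    return mega_transfers_per_s * 1e6 * (bus_width_bits / 8) / 1e9

print(bandwidth_gb_s(1333, 128))   # dual-channel DDR3-1333: ~21 GB/s
print(bandwidth_gb_s(4800, 128))   # 128-bit GDDR5 at 4.8 Gbps (HD 5770-class): ~77 GB/s

A mid-range discrete card has three to four times the bandwidth of the dual-channel system memory an on-die GPU would have to share with the CPU.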

Then you have die space.
On that one chip you need to fit 2-8 cores, controllers, cache, and the GPU; adding a mid-range GPU would drive up the size and transistor count of the chip massively, reducing yields, making it power hungry and hot, and it would be costly.

Things like Fusion based chips are just an evolution of the typical IGP, they will not replace any hardware meant for games any time soon.


Inadvertent smear?
By DominionSeraph on 11/8/2010 9:02:42 PM , Rating: 3
Jason,

quote:
NVIDIA clearly is still struggling with heating issues, so rather than utilize a noisy, high flow fan like the 400 series, this time around it is opting to pay a bit more for a nicer solution.


It is the GF100-based GTX 480, GTX 470, and GTX 465 that have the power and noise issues, not the rest of the 400 series line-up.
The GF100 is a turd, but the GF106, GF108, and especially GF104 are different things entirely and shouldn't be lumped in with it under "400 series."

If the GTX 460 was a niche product I'd give the nod to the flow of your prose over technical accuracy; but, since release, the GTX 460 has probably been the most recommended card on the Anandtech forums. Even the release of the 6850 and 6870 hasn't supplanted it -- Nvidia's price drops prior to their launch just made the Radeons slot into Nvidia's pricing scheme for a quite simple GTX 460 (768MB)-->6850-->GTX 460 (1GB)-->6870 hierarchy, with the caveat that the 6870 apparently has very little overclocking headroom.





Why don't you guys...
By smookyolo on 11/8/2010 1:21:59 PM , Rating: 2
...Actually watch the demo video? This card is going to kick serious ass.

Also, I always see people talking about power usage and heat like it's the freaking apocalypse. I never think about heat when I buy a card; my 470 never goes above 80C with stock cooling, and I never hear it.

Power usage? Unless a card takes a disgusting amount of power, like 1000W+, what's the big deal? If you can't afford that much power, maybe you should not get a top-of-the-line card...?




What took so long
By tastyratz on 11/8/2010 1:45:38 PM , Rating: 2
A vapor chamber is a nice way of saying fat heatpipe. Heatpipes have been the cooling standard for years in many staples like laptops and enthusiast CPU heatsinks. They think it's so special? I think it's stupid it took them this long. This is hardly a wow-worthy announcement. Why the hell have they NOT been using heatpipes since the 8800?




Antilles is not single-GPU
By Kaldskryke on 11/8/2010 4:52:52 PM , Rating: 2
quote:
AMD's aims for single-GPU supremacy rest on the Radeon HD 6990, an Antilles card set to launch before the end of the year.


Just like Hemlock, Antilles is expected to be a dual-GPU product, using two Cayman GPUs. This information was revealed in the Catalyst 10.8 release two and a half months ago. Unless AMD achieves an efficiency miracle, the Cayman GPUs in Antilles will be lower clocked and/or cut down. So really the GPU to look for from AMD will be Cayman, filling the roles of 6970 and 6950.




Nothing about the cooling
By ira176 on 11/12/2010 2:24:42 AM , Rating: 2
is new. Sapphire has been implementing vapor chamber cooling on certain graphics cards they've made for years.




End of innovation
By Ammohunt on 11/8/2010 2:07:21 PM , Rating: 1
That Nvidia seems to be putting more effort into cooling systems rather than solving heat issues, perhaps with a die shrink, pretty much proves to me that all they are doing is over-clocking old designs, calling them new, and failing to innovate. NVIDIA really needs to catch up to the times; the future is wattage vs. performance.




LOL
By Soldier1969 on 11/8/2010 9:20:14 PM , Rating: 1
Really, Nvidia? You were late to the DX11 game, and now this heater monstrosity. Yeah, good luck with that heat output and cost to the consumer. I'll be just fine with an AMD 6970 2GB for my 2560 x 1600 gaming in 2011, thanks.




"You can bet that Sony built a long-term business plan about being successful in Japan and that business plan is crumbling." -- Peter Moore, 24 hours before his Microsoft resignation














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki