


Storm clouds are gathering as NVIDIA faces a reinvigorated competitor

As the old saying goes, when it rains, it pours.  NVIDIA was performing beautifully thanks to the aggressive pricing and performance of its 8000 series of graphics cards.  It looked poised to leave competitor AMD (formerly ATI) in the dust.  However, the latest round in the graphics war has marked a dramatic turnaround, with AMD's 4850 and 4870 outperforming NVIDIA's offerings at a lower price.

While NVIDIA still holds a tenuous grip on the highest-end offerings with its GeForce GTX 280 GPU, even that grip may soon slip, depending on the performance of AMD's dual-GPU 4870 X2 (R700) card, likely coming in Q3 2008.  Meanwhile, NVIDIA faces challenges from Intel in its low-end and laptop graphics offerings, and from AMD's Puma chipset/graphics package in the laptop market.

The economic repercussions of NVIDIA's slippage are already visible.  NVIDIA announced yesterday that it expects to turn in revenue of $875 million to $950 million for Q2 2008, which ends July 27.  This is significantly lower than current analyst expectations of $1.1 billion.

That was not the end of the bad news from NVIDIA, either.  It announced that it is facing a massive recall due to overheating GPUs in notebook computers.  NVIDIA reported higher-than-average failure rates in both its laptop GPUs and its laptop chipsets.

NVIDIA said that the chips and their packaging were made with materials that proved to be too "weak."  NVIDIA passes part of the blame to notebook manufacturers, which it says contribute to the problem.  Typically, notebooks have poorer ventilation and components concentrated in a smaller space than desktop computers.

The result of the recall is that NVIDIA will take a one-time charge of $150 million to $200 million USD to cover the damages.  It plans to use the money to repair or replace defective parts, and it hopes to recover part of the cost from its insurers.  It has acknowledged the problem and switched the materials it uses.

The news has resulted in NVIDIA taking a beating on the stock market, with its shares sliding over 25 percent.



Comments



Ow
By MrBlastman on 7/3/08, Rating: 0
RE: Ow
By JasonMick (blog) on 7/3/2008 11:22:52 AM , Rating: 5
I think more importantly, it's a validation of AMD's assertion that monolithic GPUs just aren't what the market needs. One key reason why the 4850 and 4870 are so competitive is that they use smaller GPUs, more akin to Intel's multicore strategy.

Meanwhile, Nvidia has built this huge GPU for the 200 series that is incredibly powerful and impressive from a design standpoint, but costs much more to produce and is out of the price range of most customers. Inherently, huge GPUs mean huge dies, which mean lower yields, which finally mean higher prices. This is just not a successful strategy.

It will be interesting to see whether Nvidia sticks with the "design big" mentality, or whether reality sets in for the next round.


RE: Ow
By DEVGRU on 7/3/2008 11:46:21 AM , Rating: 2
...Lest we forget the usually much higher power consumption and higher temperatures...


RE: Ow
By Goty on 7/3/2008 11:49:57 AM , Rating: 2
Well, the actual power consumption isn't a whole lot more than the RV770 GPUs, and the temperatures are actually not too bad, either.


RE: Ow
By nosfe on 7/3/2008 12:44:49 PM , Rating: 2
Well, they did say that, based on polls, noise was a bigger concern to their users than temperature. To be frank, that's also how I feel about it; I'd prefer it run a little hotter than have a hairdryer in my case. There are also unofficial fixes for the temperature problem (make the fan spin at ~40% and lose ~20C at idle without much added noise, or so I've read in the forums).


RE: Ow
By thestain on 7/3/2008 2:33:59 PM , Rating: 2
How about more details?

From a quick reading of various news stories, no mention is made of which GPUs on which motherboards have had these problems, or which laptop makers are affected. Did Dell or some other big laptop maker screw up?

Which motherboards are having these problems? No mention of any particulars... please tell us more!!


RE: Ow
By flipmode on 7/3/08, Rating: -1
RE: Ow
By JasonMick (blog) on 7/3/2008 12:11:36 PM , Rating: 5
No, read Anandtech's exemplary breakdown on the topic.
http://anandtech.com/video/showdoc.aspx?i=3341

Yes, RV770 is biggish when compared to Intel processors, but it is smaller than the GT200 die. Yes, it's at 55nm and the GT200 is at 65nm, but even so, the normalized die for the GT200 is substantially bigger.
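
To put rough numbers on that normalization, here is a back-of-the-envelope sketch in Python. The die areas are approximate figures from published estimates (not official specs), and the ideal-scaling assumption ignores pad-bound I/O that doesn't shrink:

    # Rough die-size comparison, normalized for process node.
    # Approximate areas from published estimates (not official specs):
    # GT200 ~576 mm^2 at 65 nm, RV770 ~256 mm^2 at 55 nm.
    gt200_area = 576.0   # mm^2 at 65 nm
    rv770_area = 256.0   # mm^2 at 55 nm

    # Ideal shrink: area scales with the square of the feature size.
    scale = (55.0 / 65.0) ** 2                  # ~0.72
    gt200_at_55nm = gt200_area * scale          # ~412 mm^2

    print(f"GT200 scaled to 55 nm: ~{gt200_at_55nm:.0f} mm^2")
    print(f"Still {gt200_at_55nm / rv770_area:.1f}x the RV770 die")

Even under that generous ideal-shrink assumption, the GT200 comes out around 1.6x the RV770's area.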

And even if you toss out the die size, there are other key decisions which AMD wisely made, such as the decision to move to 55nm, the adoption of GDDR5, etc.

And to state my bias: I really prefer neither AMD nor Nvidia, but rather whichever has the best offering. My laptop currently has an Nvidia 8400 GT in it (hopefully it won't die!! -_- ) and I don't own any AMD products.

That said I still think AMD has made a number of key good decisions that have given them a superior position in the current round of graphics cards, and you'd be hard pressed to debate that without showing a strong bias.


RE: Ow
By flipmode on 7/3/08, Rating: -1
RE: Ow
By JasonMick (blog) on 7/3/2008 1:15:30 PM , Rating: 4
Umm...
From dictionary.com
mon·o·lith·ic
...
5. characterized by massiveness, total uniformity, rigidity, invulnerability, etc.

I guess you could call the RV770 monolithic in the sense that it's one piece, but the word is more frequently used to mean "big" these days. Using definition 5, the GT200 is more monolithic than the RV770.

Vocabulary is often like that... like how "pert" could mean healthy, but more frequently it's used to describe an impertinent (rude/saucy) person. ;)


RE: Ow
By flipmode on 7/3/08, Rating: -1
RE: Ow
By Mitch101 on 7/3/2008 2:16:55 PM , Rating: 2
Jefe, would you say I have a monolith of a GPU?

http://www.imdb.com/title/tt0092086/quotes


RE: Ow
By Bruneauinfo on 7/3/2008 3:12:51 PM , Rating: 2
LMAO!!

and while we're at it, let's get caught up in some semantics.

apparently, ATI is lucky they use good materials.

and nVidia probably won't be buying AMD anytime soon.


RE: Ow
By mathew7 on 7/4/2008 1:51:44 AM , Rating: 2
In your quote, the missing part is "high/top performance." So for high/top-performance cards, the single-chip solution is over. Basically, what they are saying is that they produce a mainstream chip and combine more of them for high performance. But we all know that SLI/Crossfire does not double performance. The overhead of managing more chips kills it (at least in present-day titles).


RE: Ow
By afkrotch on 7/7/2008 5:32:01 AM , Rating: 2
Let's not forget the need for driver updates that provide profile setup for games. This is where I find the biggest flaw in multiple-gpu setups.

If your game doesn't get a profile, you won't get the most performance out of the cards. AMD recently put in the profiles for Bioshock and The Witcher. Well, I stopped playing Bioshock about 2-3 weeks after it released, and I won't touch The Witcher with a 10-foot pole.

This is the whole reason I haven't bothered with a multiple GPU setup. Both strategies from AMD and Nvidia have their merits. I'm just with Nvidia on this one. I prefer having a single GPU.

Less hassle for me to watercool, less cables, no wasted slot, no need for some 8000w PSU, and so on.


RE: Ow
By TomZ on 7/3/2008 2:05:13 PM , Rating: 3
quote:
5. characterized by massiveness, total uniformity, rigidity, invulnerability, etc.

A "tech" site should stick with the "tech" definition, which is the one the OP references. Technicians and engineers in the field don't just mean massive, etc. when they say "monolithic." Your use is more of a layperson.


RE: Ow
By MamiyaOtaru on 7/4/2008 1:08:11 PM , Rating: 2
I'd say pert is more often used to describe jubblies


RE: Ow
By TheJian on 7/3/08, Rating: -1
RE: Ow
By flipmode on 7/3/2008 1:55:23 PM , Rating: 2
Whom are you responding to?


RE: Ow
By JasonMick (blog) on 7/3/2008 2:53:53 PM , Rating: 4
Someone had a busy day at the Nvidia Kool-Aid stand, I think...


RE: Ow
By TheJian on 7/7/2008 2:57:09 PM , Rating: 1
My reply was to you (sorry flipmode). No Kool-Aid involved. Nvidia already dropped their GTX 280 to $459 at Newegg (a couple of cards after rebate, and quite a few under $500). The GTX 260 is now $329 at Newegg. So what I said has already happened. They'll drop pricing to make AMD's cards worth less. They just did. You can expect more cuts the second the 4870 X2 comes out. By then the 260/280 will have a die shrink just about out the door to easily allow this and add performance. I'm not saying I LIKE Nvidia; I'm saying this is what's going to happen. Currently I'd buy a 4850/4870/GTX 260 (toss-up between the 4870 and GTX 260). But I'd also say ATI won't look quite so good after another cut from Nvidia. $460 isn't bad for king of the hill. That's $140 less than quoted in all these reviews of it. Quite a price cut in ONE month, eh? The GTX 280 doesn't look so bad now. Remember that we used to have $499/$599 cards to get top-of-the-hill performance. Right now that's only $460. That's a great buy all of a sudden. My point is, AMD has a great pair of cards, but their performance/buck was only great when the GTX 260 was $450 and the GTX 280 was $600. At $329 and $460, things change. The reviews should be updated to show this, since it happened so fast.


RE: Ow
By carl0ski on 7/4/2008 12:25:00 AM , Rating: 2
quote:
Why don't people know the definition of monolithic? From dictionary.com: consisting of one piece; solid or unbroken. RV770 is monolithic - it's one piece of silicon. That's all I'm saying.


Arguably, AMD's Barcelona quad core and Intel's Core 2 Duo are monolithic, and IBM's Cell is monolithic.

However, they are modular monolithic: they have the potential to disable a part without rendering the entire device inoperable. Prior to being built, sections may also be left out of the construction phase, i.e., Cell processors with 3, 6 or 9 cores are available.

http://www.anandtech.com/video/showdoc.aspx?i=3341...
A SIMD core is very similar to NVIDIA's SM with a couple of exceptions:

1) There are more SPs in AMD's SIMD Core (16 vs 8)


In theory, AMD can completely remove 4, 8 or 12 of those SIMD (SP) cores to make the device smaller, less power-hungry, and far cheaper.

Or, better, dynamically disable unused ones to conserve power.
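
For concreteness, a small sketch of that scaling, using RV770's published shader organization (10 SIMD cores, each with 16 SPs of 5 ALUs, for 800 stream processors); the disabled-core counts are hypothetical salvage configurations, not announced parts:

    # RV770 shader organization: 10 SIMD cores x 16 SPs x 5 ALUs = 800.
    simd_cores, sps_per_core, alus_per_sp = 10, 16, 5
    for disabled in (0, 2, 4):   # hypothetical salvage/power-saving configs
        active = simd_cores - disabled
        total = active * sps_per_core * alus_per_sp
        print(f"{active:2d} SIMD cores active -> {total:3d} stream processors")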


RE: Ow
By Clauzii on 7/9/2008 9:55:52 PM , Rating: 2
Regarding the CBE, the PS3 has 7 working Cell units, of which 6 are under user control, so there are 8 cores too :) And (you probably know) one core is NOT a CBE core like the others but a PPC acting more like the master control.


RE: Ow
By psychobriggsy on 7/3/2008 12:20:08 PM , Rating: 2
Yeah, but ATI will have their X2 cards out soon (August at the latest) which will put two *cheap* GPUs on a card to outperform the vast, huge, GTX280.

It is interesting that the RV770 is pad-bound, i.e., they can't make it any smaller unless they cut the I/Os. It is actually still a large die, but half the area of the GT200, which means it is significantly less than half the price to make (due to the economics of fabbing). And I'm sure that AMD is storing up the slightly defective dies (some dead shader cores) to launch a 4650 or similar product soon as well, for $149 or less.
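
The "economics of fabbing" point can be sketched with a toy yield model. Everything below is an illustrative assumption (the wafer cost, the defect density, and the Poisson yield model itself); the point is only that cost per good die grows faster than die area:

    import math

    def dies_per_wafer(area_mm2, wafer_diameter_mm=300.0):
        # Gross dies: wafer area / die area, minus an edge-loss correction.
        r = wafer_diameter_mm / 2.0
        return (math.pi * r ** 2 / area_mm2
                - math.pi * wafer_diameter_mm / math.sqrt(2.0 * area_mm2))

    def yield_fraction(area_mm2, defects_per_mm2=0.002):
        # Poisson model: bigger dies are likelier to catch a killer defect.
        return math.exp(-defects_per_mm2 * area_mm2)

    def cost_per_good_die(area_mm2, wafer_cost=5000.0):
        return wafer_cost / (dies_per_wafer(area_mm2) * yield_fraction(area_mm2))

    small, big = 256.0, 576.0   # mm^2, roughly RV770 vs. GT200
    print(f"area ratio: {big / small:.2f}x")   # 2.25x
    print(f"cost ratio: {cost_per_good_die(big) / cost_per_good_die(small):.2f}x")

With these made-up but plausible inputs, a 2.25x larger die comes out around 4-5x more expensive per good die, which is the shape of the argument above.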


RE: Ow
By flipmode on 7/3/2008 1:39:02 PM , Rating: 1
Yep, but why are people already using the monolithic term for RV770 when it's R700 that's monolithic?


RE: Ow
By misuspita on 7/3/2008 3:10:00 PM , Rating: 2
I find it a little hilarious that AMD chose to beat Nvidia with the same approach that got them beaten up by Intel in the CPU market.


RE: Ow
By carl0ski on 7/4/2008 12:37:51 AM , Rating: 2
What, the "slower, but strength in numbers" category?

4-socket and 8-socket quad-core Opterons still perform and sell very well.

It is like a large number of small predators using sheer numbers to overpower very large prey.

ATI is probably banking on placing 2 small 4870 chips in the same real estate on a discrete card as 1 enormous GTX 280. And as that number doubles -- NVIDIA fitting 2 GTX 280s on one board -- ATI can aim for 4 small 4870s, scaling far higher.


RE: Ow
By MonkeyPaw on 7/3/2008 3:33:51 PM , Rating: 2
Well, it appears that RV770 still has room to scale, and soon:

http://www.xbitlabs.com/news/video/display/2008070...

quote:
Diamond “Unlocked” Radeon HD 4870 XOC Black Edition graphics card comes with graphics processing unit clocked at 800MHz and 512MB of GDDR5 memory operating at 4400MHz, up from 750MHz/3600MHz on reference design ATI Radeon HD 4870 graphics adapter. Moreover, according to Diamond Multimedia, the firmware of the board was modified and the board can be overclocked even further and can leave behind the current flagship Nvidia, the GeForce GTX 280.

“The firmware was custom designed to enable end users to go beyond the normal over clocked speeds and allow them to push their cards for higher performance via the Catalyst Control Center. The GPU’s custom firmware has been unlocked to push cards to GPU settings of up to 950MHz and Memory of up 1200MHz,” said Mr. Gastelum.


That could mean there will be a 4890 with even higher clocks. The 4870X2 may not even be needed to topple the GT200 if the RV770 can scale that much higher.
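
Quick arithmetic on the headroom those quoted numbers imply (the clocks are taken from the quote above; the percentages are just ratios, not benchmark results):

    # Headroom implied by the Diamond XOC figures vs. the HD 4870 reference.
    ref_core, ref_mem = 750.0, 3600.0     # MHz, reference HD 4870
    xoc_core, xoc_mem = 800.0, 4400.0     # MHz, factory-overclocked XOC
    fw_cap = 950.0                        # MHz, firmware-unlocked core ceiling

    print(f"factory core OC:  +{(xoc_core / ref_core - 1) * 100:.1f}%")  # +6.7%
    print(f"factory mem OC:   +{(xoc_mem / ref_mem - 1) * 100:.1f}%")    # +22.2%
    print(f"firmware ceiling: +{(fw_cap / ref_core - 1) * 100:.1f}%")    # +26.7%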

As for the "Monolithic" comment--it's a little misused in this case. That term applies to Phenom X4 vs Core2 quad. Intel uses 2 dies in MCM, while AMD has one "monolithic" die. The term doesn't work for GPUs until someone comes up with a similar MCM approach. Even X2s and GX2s don't count, since those are more like having 2 cards stuck together, not 2 GPUs on one BGA socket.


RE: Ow
By bill3 on 7/7/2008 8:22:47 AM , Rating: 2
I don't think building a monolithic GPU was so much the problem; it's that Nvidia built a monolithic GPU that isn't fast enough for its huge size. It's over twice as big and costly as RV770, but it's not nearly twice as fast. I think if it WAS twice as fast, there would be no real problem here for Nvidia. So blame Nvidia's engineering inefficiency, not the size of the chip.

In fact, the main problem with the GT200 series can really be boiled down to an even simpler one: clock speed. The GT260 often performs not much better than a 9800GTX. And there's a reason: it isn't much better! Both have the same number of TMUs, but the 9800GTX's 128 shaders are clocked so much higher than the GT260's 192 that the 9800GTX is not far behind in raw shader power. Adjusted for clock, the 9800's 128 shaders at 1690 MHz are equivalent to 166 shaders at the GT260's 1296 MHz shader clock. The 9800GTX+ gets even closer, being equivalent to 181 shaders. Not only that, but the GTX has a significant core clock advantage as well. Of course, the 260 has more ROPs, more VRAM, and more bandwidth, but whenever those are not a major factor, which is often, you see it isn't much better than the 9800GTX.
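
That clock-for-clock equivalence is easy to check (the shader counts and clocks below are the figures quoted in this post, not independently verified specs):

    # Normalize shader counts to the GT260's shader clock.
    gt260_clock = 1296.0    # MHz, as quoted above
    cards = {
        "9800GTX":  (128, 1690.0),   # (shader count, shader clock in MHz)
        "9800GTX+": (128, 1836.0),
    }
    for name, (shaders, clock) in cards.items():
        equiv = shaders * clock / gt260_clock
        print(f"{name}: ~{equiv:.0f} GT260-clocked shaders (vs. its 192)")
    # 9800GTX -> ~167, 9800GTX+ -> ~181, matching the numbers above.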

I think unless ATI has an answer for the upcoming 55nm GT200 revision, which will probably be clocked a lot higher, the GT200 series could still end up a winner. Time will tell.

I mean, realistically the 956 million transistors in RV770 make it pretty "monolithic" itself. Just less "monolithic" than the other guy.

In fact, I've heard one brave soul argue that AMD's non-monolithic strategy was a grave mistake, in that they probably could have easily taken the single-GPU performance crown outright with this architecture had they been willing to make the chip just a little bigger (say, 30% larger, with 60 TMUs and 1200 SPs or something like that). That's an interesting idea, at least.


RE: Ow
By abzillah on 7/3/2008 12:46:59 PM , Rating: 1
I don't see how this is all that bad. Nvidia made $875 million last quarter and is making $950 million this quarter, which is more than before, but less than analysts predicted. A few people may sell their shares of Nvidia stock, but this is not going to have a negative effect on the company. Can you guys really tell me that this is something bad for the company?!


RE: Ow
By TomZ on 7/3/2008 2:00:50 PM , Rating: 2
It's more a question of expectations. Obviously many investors expected higher revenue, and that expectation was priced into the (higher) stock price. Now that the revenue is lower, the stock is perceived to be less valuable, and so the price goes down. All seems normal to me.

In other words, it's not really a question of "good" or "bad," but more a question of relative value.


RE: Ow
By deadrats on 7/6/2008 1:27:00 PM , Rating: 2
quote:
I'm not worried though, I am sure Nvidia will bounce back over time. They are a well-organized company that produces solid results.


I'm sure many people said the same thing about 3dfx before they went under.

I'll make a bold prediction right now: not only will Nvidia be out of business in 3 years, but discrete graphics add-in cards as we know them will also be a thing of the past.

Why?

Because the industry is moving towards hybrid CPU/GPU chips, that's why. Microsoft has already announced that new versions of the Xbox 360 will use a hybrid CPU/GPU to lower costs and reduce heat output; AMD is moving forward with its plans for an integrated GPU on its next-next generation of CPUs; and Intel is obviously moving in that direction: its discrete graphics card will feature x86 cores, and it has already announced that some high-end versions of Nehalem will have integrated graphics chips (and in another generation of CPUs it will also use DRAM instead of SRAM for its caches).

I personally don't see discrete graphics cards being in any demand once we have 12-, 16- or more-core CPUs with integrated graphics chips; I just can't see a discrete graphics solution being faster.


Gaming recession or diversion?
By jamesbond007 on 7/3/2008 11:30:18 AM , Rating: 3
Not sure about you guys or your neck of the woods, but around here things are quieting down as far as games and gaming go in the community. I used to help a bunch of locals set up and run LAN parties in the region, but in recent years the attendance became abysmal and we quit hosting them.

This leads me to believe that there may be a lack of desire for the high-end cards and commitment to PC gaming. I place the blame on the new consoles. :) Most of my personal friends are now gaming online via their Xbox 360 or PlayStation 3. Some just prefer the simplicity and convenience of putting in a disc and playing.

I'm not saying this is a fact for everyone's community, but I really miss the fun of gaming in groups at LANs in the area. Anyone else care to comment on their local status?

nVIDIA also seems to be catching a lot of flak from those who do game over its lack of better or more reliable drivers for Vista.




RE: Gaming recession or diversion?
By HrilL on 7/3/2008 12:05:19 PM , Rating: 4
I would have to agree with you. A few years back we used to have some rather large and small local LANs. No one seems to play the same kinds of games as in the past. It is probably due to the consoles and the RPG games that you play online. But overall, PC gaming seems to be in somewhat of a slump. All the new kids are playing consoles nowadays, it seems, and the hardcore PC gamers are sticking with the games they have been playing for the last 8 years. Myself, I only play StarCraft and Counter-Strike, and I don't need a new computer to do it. What I have plays those games just fine.


RE: Gaming recession or diversion?
By Mitch101 on 7/3/2008 12:38:17 PM , Rating: 3
I would agree that WoW/RPGs probably killed the LAN party to a degree. When the FPS was king, the LAN party was on. Instant gratification to gib a friend and hear them scream.

Our LAN parties were more a reason to drink beer and Red Bull while gorging on chips and pizza.

I'm hoping StarCraft brings the LAN parties back a bit.


RE: Gaming recession or diversion?
By just4U on 7/6/2008 4:08:36 AM , Rating: 2
I would think that it has more to do with the adoption of high speed connections as opposed to anything else.


RE: Gaming recession or diversion?
By gaakf on 7/3/2008 12:40:10 PM , Rating: 2
Ahh, but you will need a new computer for StarCraft 2. I have faith that Blizzard will restore some of the passion PC gamers have lost for the platform over the next couple of years. A new game in each of the Warcraft, StarCraft and Diablo series, as well as a brand new Battle.net, is one hell of a thing to look forward to.


By SavagePotato on 7/3/2008 12:46:42 PM , Rating: 4
A brand new battlenet can only mean brand new bnet tards of the next generation.

Kind of a mixed bag as far as calling it a good thing.


RE: Gaming recession or diversion?
By SavagePotato on 7/3/2008 12:34:38 PM , Rating: 5
I wouldn't blame the weakening of the LAN on consoles.

It goes both ways really: PC users have access to high-speed internet too, and by and large, who wants to trudge to a LAN to game when you can play online with perfectly fine latencies any time you want?

LANs were great back in the days of dial-up; now, though, not so necessary.


RE: Gaming recession or diversion?
By JoshuaBuss on 7/3/2008 7:05:40 PM , Rating: 2
Not to mention free VPNs like Hamachi make virtual LANs almost as good as local LANs.

For me, LAN parties existed simply because it was impossible to enjoy a game of Counter-Strike when most of your friends had pings over 400.

Now, over Hamachi, you can have ~40ms pings with people on the other side of the country.


RE: Gaming recession or diversion?
By jamesbond007 on 7/5/2008 12:15:54 AM , Rating: 3
But you can't honestly say that it is equally enjoyable to frag someone over the Net and just see the kill come up on the screen. Killing someone (virtually, of course) at a LAN and hearing them scream like a girl is great. Plus, there are always the sweet goodies, prizes, file sharing (yeah, we all do it!) and the interaction with real people. It's a pleasurable experience, and no online gaming session has ever come close to the memories I have of LANing.

~Travis


By overzealot on 7/9/2008 5:00:13 AM , Rating: 2
Teamspeak and Ventrilo fulfill those screaming needs.
I still LAN a bit, but I find the quality of opponents far better online.


By MrBlastman on 7/3/2008 1:10:38 PM , Rating: 2
No amount of consoles can get me to switch from PC gaming... ever.

I keep my Wii around because console goodness is fun when I have friends over, but there is no replacement for mouse/keyboard, or for the thousands of dollars in expensive HOTAS/head-tracking equipment I need for military simulations.


By StillPimpin on 7/3/2008 3:29:29 PM , Rating: 3
I think one more thing that might be overlooked is broadband. Back in the days of dial-up and expensive, slower broadband, it was much easier to grab your rig, take it to the local LAN party and get your frag on.

Now, with 6+ Mb/s connections and in-game communications, "being there" is kind of a non-issue. Especially if I can do it all from the comfort of my own home and not have to lug around a big computer and monitor.

I do believe that the current consoles have been stealing some of the LAN parties' thunder, but convenience is killing them more.


By kilkennycat on 7/3/2008 6:59:53 PM , Rating: 2
The thing that has really pulled the Nvidia stock price down so drastically in the last couple of days is the $200 million write-off for this quarter and its consequent effect on the quarter's net earnings (probably a loss...). In today's uncertain conditions, 3 months of patience is too much of a strain for the big fund investors who own(ed?) 85% of nVidia's stock. Probably a great time for individuals to buy and hold for at least 6 months. nVidia has ~$800 million in the bank and ZERO debt. Remember that the nVidia management team that banded together to survive the FX 5800 debacle that nearly dragged the company under is still intact, and the company is financially in great shape.

As for the GT280/260 -- considering the competition/cost pressure from ATI, I suspect that a 55nm version will be put on a high-priority fast track to production. Lower die cost, better yield, much faster or lower power (or any combo in between). Since the G92-family GTX has now made the transition to 55nm, and the GT280/260 has the same fundamental building blocks but with enhanced data paths and compute units, such a "shrink" should be pretty straightforward. I doubt this shrink was in nVidia's short-term 'grand plan' for near-term production, but history has shown nVidia again and again to be capable of making very rapid corrective business decisions.


How many percent...
By 325hhee on 7/3/2008 1:46:29 PM , Rating: 2
Of gamers really buy those high-end $1000 cards? I've always been on a budget, and the most I ever spent on a card was about $300 on the 2900 Pro, which I semi-regret, but thank goodness for the BIOS flash. It gave me a bit of better value.

That said, I swore I'll never spend that kind of money on a vid card ever again. Though my next purchase will be a 4870, when I can get it for $250 or less after instant and mail-in rebates.

As much as I hear people touting the 8800 Ultra, in the games I've played there were maybe 3 people who said they own the card, and then again, they say they have money to burn. Who knows, but across 3 MMORPGs and tons of FPS games, it's weird how few of the people I've played with said they owned that card.

So it makes me wonder who out there has the money to buy those cards, how many ultra-high-end cards Nvidia sells, and whether there is truly a profitable market for those cards. I think the thing ATI got right, outside the card itself, was the budget gamer market. Keep the prices low and affordable. Many people don't want to spend more than $100-$150 on a vid card, and when I go back and look at the amount of money I've spent on vid cards, it scares me.

I'm ready to pick up a 4850, but I'm going all in for the 4870, only at a sale price. Nvidia needs to rethink its market and target the mid-level cards, and they have to stop forcing price gouging, as that leaked memo proved they were doing. Perfect example: the 9600GT should have, and would have, sold fast if they kept it at the $100 price range. Instead they got greedy.




RE: How many percent...
By Inkjammer on 7/3/2008 2:20:11 PM , Rating: 1
As a person who bought a 9800 GX2 three months ago... let me assure you, even those of us with money often regret it. It's a great card, don't get me wrong, but I'd honestly rather have gone with a 4870 or 9800 GTX+ if I could do it again.

And honestly? Yeah, most of the people I know are the kind who buy $600+ cards. Or were, I should say. We used to rush out to buy the 8800 GTXs and whatnot. I went from SLI 7800 GTXs to 8800 GTX SLI to the 9800 GX2. In fact, over time I've had three 8800 GTXs (one I fried while modding). But the days of us spending that much are over. All the people I know are looking at the mid-range for better value. In their minds, they'd rather buy the best mid-range for $300 and upgrade to the next best card every time it comes out. The high end isn't offering the performance lead it once was.

It's just cheaper in the long run. Buy the best mid-range card for $300, then upgrade to the next best card down the line and sell your old one. You can stay on the cutting edge longer while not burning through your bank account.


RE: How many percent...
By suryad on 7/3/2008 2:51:47 PM , Rating: 2
Fair point. I have 2 Ultras in SLI (8800 series, of course), and even seeing all these benchmarks of these new cards, in SLI my setup trumps them. All that means, though, is that I do not have to upgrade my cards for a while. I am looking forward to the 280 die shrink to see if there are any highly overclocked versions released at that time; I am going to grab a pair and give my current ones to my brother. I think if you buy an expensive graphics solution once, technically it can last you a really long time, rather than buying a mid-range card at every new generation.


RE: How many percent...
By MrBungle123 on 7/3/2008 3:13:55 PM , Rating: 2
quote:
I think if you buy an expensive graphics solution once technically you can last a really long time than buying a mid range and buying one at ever new generation.


Let's say you're trying to game with high settings at a relatively high resolution, and, for the sake of argument, that "high end" video cards cost $450.

You go and buy SLI for $900 and I go and buy a single card for $450. During the first year, both of us can play all the games at playable frame rates. During year 2, my card starts to lag, so I go and buy a new video card for $450. Now my new card is faster than your SLI rig, since I don't suffer from the driver issues and extra CPU overhead that come along with SLI, plus I get all the new features that have been added to graphics cards over the past year. The other thing is that I technically spent less money than you did, since inflation has occurred since the first round of video card purchases, making the next year's $450 a little less painful than it was the first time.

Personally, I'll go with option 2, since it requires less investment up front and I don't see removing a couple of screws to replace a video card as being all that much of a hassle.
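
For what it's worth, the raw dollar math of the two strategies looks like this (the 3% inflation rate is an assumed figure for illustration, not a claim about actual 2008 inflation):

    # Two-year cost of each strategy, in today's dollars.
    inflation = 0.03                                 # assumed annual rate
    sli_up_front = 900.0                             # two $450 cards in year 1
    staggered = 450.0 + 450.0 / (1 + inflation)      # one now, one next year

    print(f"SLI up front:   ${sli_up_front:.2f}")
    print(f"Staggered buys: ${staggered:.2f} in today's dollars")
    # Staggered spending is slightly cheaper in real terms, and the year-2
    # card is a newer design for the same nominal price.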


RE: How many percent...
By MrBungle123 on 7/3/2008 3:00:45 PM , Rating: 3
quote:
I went from SLI 7800 GTXs to an 8800 GTX SLI


OK that makes sense...

quote:
8800 GTX SLI to the 9800 GX2.


Don't you read up on what a product is before you plunk down like $500 on something? IMO you kind of deserve to regret that decision.


RE: How many percent...
By Inkjammer on 7/7/2008 4:43:48 PM , Rating: 2
Yes, I did... which is exactly why I bought the 9800 GX2. I have a high-end 3ds Max station and wanted the power for both gaming and 3D graphics.

Due to the layout of recent motherboards like my 780i, you are forced to sacrifice valuable PCI-E slots with that second card. I needed the power of both cards... in one. That lost slot would have kept me, over time, from building an ideal workstation.


Recall?
By Chadder007 on 7/3/2008 11:39:26 AM , Rating: 3
What laptops or GPUs are affected by this recall? I haven't heard about it before.




RE: Recall?
By Rookierookie on 7/3/2008 11:44:03 AM , Rating: 2
I'm interested in that too, since I do have an Nvidia GPU in my laptop...


RE: Recall?
By nosfe on 7/3/2008 12:48:19 PM , Rating: 2
You really think they'll tell? They'll keep it under wraps until something bad happens; just look at the whole exploding-batteries thing (sure, a different company, but still a company).


RE: Recall?
By neothe0ne on 7/3/2008 12:50:27 PM , Rating: 2
I'm also interested; HP gave me a GeForce 6150 Go. There's no link in the Anandtech article to follow for more information either, which I found odd.


RE: Recall?
By taanio on 7/3/2008 3:06:24 PM , Rating: 2
I'm on my 3rd 8400M in my Dell XPS M1330; the other 2 died with artifacts all over the place before refusing to POST. The only other XPS M1330 owner I know is also on his 2nd...


Nvidia will bounce back
By tviceman on 7/3/2008 11:30:16 AM , Rating: 2
The GTX 200 series isn't the travesty that the FX 5800 cards were, but they definitely missed the mark with their newest offerings.

This is great for us consumers, though. AMD/ATI absolutely had to have a highly competitive product with the 48xx series; otherwise they might have completely surrendered the high-end/mid-range graphics market. Nvidia can withstand a few bad quarters and get right back on track - which I'm confident they will. The next video chipset they release will likely bring much greater value and performance improvements.




RE: Nvidia will bounce back
By homerdog on 7/3/2008 11:57:04 AM , Rating: 2
Not only did NVIDIA miss the mark (RV770 comes too close to GT200), but given the market reaction it appears they were aiming at the wrong target ($650 for a graphics card? *_* ).


RE: Nvidia will bounce back
By adam92682 on 7/3/2008 12:16:25 PM , Rating: 2
I would have no problem paying $650 for a card if it could run Crysis with a minimum of 60 fps on Very High.


RE: Nvidia will bounce back
By homerdog on 7/3/2008 1:04:54 PM , Rating: 2
Right, but most gamers just can't afford to spend that much on a single component regardless of how fast it is. AMD has captured a large audience with RV770, a chip that can compete with the best of the best from NVIDIA in its $300 incarnation while still being profitable and darn fast when put on a $200 card.


RE: Nvidia will bounce back
By jeromekwok on 7/4/2008 3:21:02 AM , Rating: 2
See, ATI came back with the 4800 performers at very aggressive pricing. I would not be surprised to see a Radeon 4650 with 400 shaders outperforming the 9600GT at a lower price tag. This will make it very difficult for Nvidia to move 9600 and 8800 cards.
Nvidia will not have DX10.1 cards until next year.


By Dianoda on 7/3/2008 2:05:28 PM , Rating: 2
The actual value of the company hasn't really changed much since two weeks ago, but the market didn't know what it was really worth then. NVIDIA's announcement brought the market new information, changing expectations regarding the firm's future earnings. The price of NVDA stock over the last two days strongly supports the idea that the market is operating under the semi-strong form of the EMH (efficient market hypothesis), where the market price of a security reflects all publicly available information. If NVIDIA had hinted to the market that something like this might happen, the market probably would never have overvalued the stock so much, and the drop in price from today's announcement probably would have been better received. In short, the stock's price was artificially higher than it should have been, due to non-disclosure (until now) by the company.

Also, "yesterday" as opposed to "two weeks ago" would probably be more accurate. NVDA is down 30% on the day. Ouch.


By TomZ on 7/3/2008 2:08:57 PM , Rating: 2
If you think the stock is oversold, then it's a great time to buy, right? That way, you can find out whether you are right or wrong with your own money on the line.

Buffer = regulation = inherently bad in this case. Better to just let the stock price move as the market sees fit. After all, where's the harm in large price swings? Also, people don't just lose money on large price swings; they make money as well.


By Amiga500 on 7/3/2008 5:35:41 PM , Rating: 2
quote:
After all, where's the harm in large price swings?

It results in stupid decisions being made by panicked people.


By kilkennycat on 7/3/2008 7:55:18 PM , Rating: 2
The smart individual investor, in for the long term and very knowledgeable about the companies in which he/she invests, can take advantage of the dumb activities of the big fund managers. 85% of nVidia's stock was (at least up to yesterday) in the "capable hands" of big fund managers... who are in total panic at the moment, trying to put fingers in the dikes of their stock portfolios as the stock market dives.

Not a recommendation to specifically buy nV's stock. Just an observation.


TAKING RESPONSIBILITY.
By JonnyDough on 7/3/2008 6:22:49 PM , Rating: 1
quote:
NVIDIA passes part of the blame to notebook manufacturers, which it says contribute to the problem.


There's a great business strategy. Place blame on your buyers. That's the ticket!

No company can do better than to admit mistakes publicly and accept full responsibility, as has been demonstrated by Mattel.

Sometimes I feel like I'm the only American that was taught/learned how to apologize.




RE: TAKING RESPONSIBILITY.
By kilkennycat on 7/3/2008 7:42:15 PM , Rating: 2
The DailyTech author must have copied that quote from somewhere else and never bothered to look at the original announcement on the nVidia website.

Here is the actual text of the statement from nV's CEO in the nVidia public announcement.

quote:
Regarding the notebook field failures, NVIDIA president and CEO Jen-Hsun Huang stated:-
"Although the failure appears related to the combination of the interaction between the chip material set and system design, we have a responsibility to our customers and will take our part in resolving this problem. The GPU has become an increasingly important part of the computing experience and we are seeing more interest by PC OEMs to adopt GPUs in more platforms. Recognizing that the GPU is one of the most complex processors in the system, it is critical that we now work more closely with notebook system designers and our chip foundries to ensure that the GPU and the system are designed collaboratively for the best performance and robustness "


And he also said:-
quote:
This has been a challenging experience for us. However, the lessons we've learned will help us build far more robust products in the future, and become a more valuable system design partner to our customers. As for the present, we have switched production to a more robust die/package material set and are working proactively with our OEM partners to develop system management software that will provide better thermal management to the GPU


Seems as if there is considerable "mea culpa" from Jen-Hsun here and only a little finger-pointing at the notebook manufacturers... Considering the danger to one's reproductive capabilities that many high-performance laptops/notebooks exhibit, I personally think that placing at least some responsibility on the system manufacturers for adequate system cooling is quite justified...

For the complete press-release see:-

http://www.nvidia.com/object/io_1215037160521.html


RE: TAKING RESPONSIBILITY.
By crystal clear on 7/6/2008 7:48:41 AM , Rating: 2
This was the same guy (I mean the CEO) who once said:

"We're going to open a can of whoop-ass."

He has now become the joke of the industry...

Well, back again to AMD & Intel graphics... safe, reliable & cheap (AMD)...

AMD couldn't ask for anything better than this, whilst Intel has a good laugh & says:

"Intel inside everything"


RE: TAKING RESPONSIBILITY.
By crystal clear on 7/7/2008 6:29:25 AM , Rating: 2
Scanning through forums, I find this...

See the picture!

"Got Fried!"
Jul 2 2008, 09:52 AM

http://forums.nvidia.com/index.php?showtopic=71429...

A filing with the US Securities and Exchange Commission states that the number of notebooks shipped and sold with the defect was in "significant quantities."

The above picture is just an example... there could be "significant quantities" out there in the market. In simple language, there must be tens of thousands (or more) of "got fried" machines out there whose owners don't post their fried eggs on forums.

Remember those exploding Dell laptops; now get ready for a meltdown.


Another thing to consider
By tviceman on 7/3/2008 11:34:35 AM , Rating: 2
With half of hardcore gamers having owned an 8800GT or better prior to the new releases from Nvidia and ATI, there really isn't that much incentive to upgrade to a completely new and different card (especially when many of these owners can SLI their existing cards but have yet to do so). Is a 10-25% performance improvement worth $180-300? Probably not.

Part of the problem is market saturation. The 8800GT and 9600GT were such great cards for the money that Nvidia's current cards (and AMD's, IMO) don't justify the price to upgrade.




RE: Another thing to consider
By ImSpartacus on 7/3/2008 11:53:18 AM , Rating: 2
Yeah, there's almost no need to get a GTX 200 chip; they cost too much when the G92s are so capable and so cheap. Granted, the 4800s are superior to the G92, but before those came out a lot of people owned a G92, and there just isn't a reason to switch.

I'm just waiting for the 8800GT to hit rock bottom so I can grab one for SLI.


RE: Another thing to consider
By ajfink on 7/3/2008 12:01:39 PM , Rating: 2
I don't think it's that bleak. Some gamers like buying new graphics cards to stay up to date. Others may still be playing with an 8800GTS 320 or something along those lines, and when they see a 4850 come along for less than $200 that really outperforms what they're using, they can often be enticed to buy. People will often simply skip a generation because what they're using works well enough, and then pounce on the newer generation that really takes things up a notch. This whole online tech community (I don't mean just DailyTech) that is constantly talking about the latest and greatest tech wares really does have a lot of driving force in getting consumers, especially those into gaming and computer technology, to keep buying.


oh man, that's not good
By gochichi on 7/3/2008 4:48:22 PM , Rating: 2
I like ATI a lot; I think that when they are up, they are way up (the lil' Radeon 9800 Pro, I'm talking).

However, NVIDIA is a little guy, and I don't want to see them hurting too badly either; we absolutely need two companies making awesome graphics cards. A third company would be even better... and Intel may just join the fray in the next year or two.

Anyhow, sorry to hear that NVIDIA is having problems on top of the 4850. NVIDIA needs to keep power consumption low and partner up with Creative Labs to make a sound & video card combo (like ATI but even better). This is one volatile industry (a 25% stock loss!?).




RE: oh man, that's not good
By lukasbradley on 7/3/2008 5:58:52 PM , Rating: 2
AMD - Market Cap of $3.21B
NVDA - Market Cap of $6.93B

Neither one is really a "little" guy.


RE: oh man, that's not good
By Warren21 on 7/4/2008 1:11:18 PM , Rating: 1
You're missing the point. This stock drop is unrelated to ATI's newest 48xx cards (at least outwardly); it is due to shoddy GeForce 8xxxM product design/materials/execution.

Also, like the other guy just said, NV is twice the size of AMD (the whole company, not just the graphics division), and neither is really 'small'.

---------- OT ------------

One other thing -- do you think NV gives a damn about the consumer? No, they give a damn about your money. Who's to say the 9800 GTX would've dropped from $349 to $199 if it were not for AMD? NV could have dropped that price any time before then, but they didn't.

Likewise, AMD could've released the 4850 at $299 and the 4870 at $399 and still been competitive with the original $449/$649 GTX 260/280 prices, but they didn't. So ask yourself: who is really better for the consumer? It is AMD driving prices down.

Bottom line: nVIDIA has become arrogant over the past couple of years in the lead and has been quite happy overcharging the consumer for that lead. We should be thanking AMD for this return to competition.


Tidbit
By NullSubroutine on 7/3/2008 11:19:08 AM , Rating: 2
R700 = the RV770 chip times 2
RV770 = the chip in the 4800 series

You said RV770X2, which isn't a proper code name (might just be a typo).




RE: Tid bit
By NullSubroutine on 7/3/2008 11:21:47 AM , Rating: 2
I didn't mean to say RV770X2; just that RV700 wasn't right (who isn't proofreading now?)


Blame shifting...
By StillPimpin on 7/3/2008 3:06:15 PM , Rating: 2
quote:
NVIDIA said that the chips and their packaging were made with materials that proved to be too "weak." NVIDIA passes part of the blame to notebook manufacturers, which it says contribute to the problem. Typically, notebooks have poorer ventilation and components concentrated in a smaller space than desktop computers.


That sounds to me like they were using the same material to package desktop and notebook components, then turning around and blaming notebook manufacturers for using said notebook components in *GASP!* notebooks. Please, NVIDIA engineers should have had enough sense to:

A) Recognize that notebook chipsets and GPUs have to withstand much higher temperature requirements than desktop components.

B) Validate said components in a notebook where temperatures could exceed normal operating temperatures.

Stop blaming others for your mistake NVIDIA.




RE: Blame shifting...
By Parhel on 7/3/2008 3:28:02 PM , Rating: 2
quote:
Please, NVIDIA engineers should have had enough sense to:

A) Recognize that notebook chipsets and GPUs have to withstand much higher temperature requirements than desktop components.


Is that just conjecture, or do you have some reason to think that? Because that sounds pretty far-fetched. It's not as if this is nVidia's first time out of the gate designing GPUs for notebooks.


Epic fail...
By n7 on 7/3/2008 11:19:07 PM , Rating: 2
A few things make more sense now.

HP has what I'd call a soft recall out on various models of their notebooks.
I don't remember all the models, but it involves the dv3000, 6000, & 9000 series, plus a few more.

There are various symptoms that qualify for repair within 2 yrs. (even after the 1 yr. warranty has expired).

But the common theme is that every one has an AMD CPU, uses an nVidia chipset, & the repair involves replacing the motherboard.
The most common issue is that the notebook will cease to POST.

I'd like to say it's uncommon, but it's not... it happens a lot, actually.

I suspect HP has screamed pretty loudly at nVidia lately, as this has been going on for quite a while now.




RE: Epic fail...
By n7 on 7/4/2008 12:33:45 AM , Rating: 2
Ah, here we go... here's the link mentioning all the models that can be affected.
http://h10025.www1.hp.com/ewfrf/wc/document?lc=en&...

I've seen many come through that are dead, or dying, all w/ AMD CPU + nVidia chipset.


Lets not forget the 7900 series GPU's...
By Reclaimer77 on 7/4/08, Rating: 0
RE: Lets not forget the 7900 series GPU's...
By Reclaimer77 on 7/4/2008 10:16:49 AM , Rating: 1
Yeah, whatever, rate me down fanbois. You weren't the ones staying up all night trying possible fix after fix, only to realize it's simply a defective product.


RE: Lets not forget the 7900 series GPU's...
By DASQ on 7/4/2008 12:15:27 PM , Rating: 1
My 7950GT kicked the bucket after about 8 months.


By Shining Arcanine on 7/9/2008 12:35:35 AM , Rating: 2
Mine is still going strong.


Chyeah, riight.
By BruceLeet on 7/5/2008 6:50:23 AM , Rating: 2
I think we all know why they are predicting weak sales.

Also, their die size is just embarrassing by comparison; their fab process isn't efficient enough to be profitable. They are priced high for a reason -- the huge dies -- and AMD's latest offerings have forced them to cut prices; an impromptu die shrink doesn't help their profits either.

Eh, I'll be building a new rig this winter; we'll see what Nvidia and AMD have laid out on the table by then.

In the meantime I'll be monitoring prices on the E8500 or Q9650 to OC @ 4.x, and being from Canada, during the winter it's -30C average. I'll have my tower near a window, put a couple of 120mm fans in the case, and oh yes, I'll hit 4GHz easy. I say 15C idle and 25C-30C load. Oh yes. /daydream




RE: Chyeah, riight.
By justjc on 7/6/2008 8:22:08 AM , Rating: 2
The recalled chip that isn't mentioned in the article is the nVidia GeForce 8500M, as can be read at http://www.dvhardware.net/article28363.html where the problem is also explained a bit better.


Stock Purchase
By lukasbradley on 7/3/2008 3:14:53 PM , Rating: 2
I bought some this morning.




RE: Stock Purchase
By FaceMaster on 7/3/2008 7:47:15 PM , Rating: 1
BAD DECISION. You might have been better off buying the graphics cards themselves and trying to sell them off at a profit a couple of months later...


Oh how the mighty have fallen
By othercents on 7/3/2008 11:16:22 AM , Rating: 2
NVIDIA is great for the market, always innovating, but the loss of the middle and low-end markets will make it hard for them to compete. I can't wait to see what NVIDIA does in the next year to offset the losses.

Other




By allnighter on 7/3/2008 12:08:38 PM , Rating: 2
You may want to change your title a little bit.
nVidia did not post any earnings; they issued a Q2 profit warning and lowered revenue expectations.
There's a pretty big difference between the two.
Not that it'll change anything for the stock today ;)




Its not just graphics.
By Mitch101 on 7/3/2008 12:26:40 PM , Rating: 2
It's not just the GPUs. Factor in the mobo chipsets as well.

On the AMD side you have the 780G, which is much better than Nvidia's 7100 lineup on motherboards.

On the Intel side, Intel is making its best motherboard chipsets in a long time, and NVIDIA's high-end mobo chipsets (780i) are overpriced. Try finding one under $250.00. If you have to spend that much more, you might as well get the next step up in video card instead.

Lastly, Intel still hasn't granted NVIDIA a license to produce chipsets for Nehalem. This is huge, because benchmarks of Nehalem are pretty amazing, and NVIDIA would not be able to sell mobos for it.

NVIDIA's bread and butter in graphics is getting beaten by ATI because of bang for the buck.

NVIDIA is in the PS3. ATI is in the Xbox 360 and Wii. The 360 is going to drop in price by $50.00 soon, and a new 360 is supposedly on the horizon. Not sure if it will be Blu-ray or the rumored Toshiba return to HD media.

One small item of hope, though I don't think it will be enough: NVIDIA will be in the supposed iPhone killer. But really, what are the odds of selling enough of these to make a real difference, when no one's marketing has yet managed to push a true iPhone/iPod killer? Sure, it might be better, but better hasn't overthrown the iPod/iPhone devices or remotely come close yet. If that were true, SanDisk would probably be king.

Like I said in another thread about NVIDIA -- even a god king can bleed.




By bupkus on 7/3/2008 12:51:46 PM , Rating: 2
I believe AMD has learned a very good lesson and has taken a page from Intel: shrink, shrink, shrink.
If you can shrink your die size, you can reduce fab costs per chip and run cooler and faster.
And it's not just sizing down to 55nm, but using fewer, more effective transistors and a better design.

I believe AMD had an opportunity to impart that lesson to ATI and made a convincing argument to lead not just with advancing technology, but by addressing the market first.

Gaming consoles are delivering a more powerful and perhaps less expensive platform, so gaming-level video cards have broader competition.

Declining consumer confidence means greater difficulty in justifying diverting cash from filling one's gas tank to expensive gaming.

I'm also summarizing here what others have posted above.




When it rains it pours
By Griswold on 7/7/2008 3:38:10 PM , Rating: 2
It's Nvidia's turn to sail through rough weather for a change...




By sonofbc on 7/11/2008 10:29:27 AM , Rating: 2
Oops... best not to count your chickens before they have actually hatched!!




Hey Nvidia
By SiliconAddict on 7/8/2008 1:35:00 AM , Rating: 1
Where are your talking-shit-about-Intel press releases now, beech?




"And boy have we patented it!" -- Steve Jobs, Macworld 2007














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki