


Storm clouds are gathering as NVIDIA faces a reinvigorated competitor

As the old saying goes, when it rains, it pours.  NVIDIA had been performing beautifully thanks to the aggressive pricing and performance of its GeForce 8000 series of graphics cards, and looked poised to leave competitor AMD (formerly ATI) in the dust.  However, the latest round in the graphics war has marked a dramatic turnaround, with AMD's 4850 and 4870 outperforming NVIDIA's offerings at a lower price.

While NVIDIA still holds a tenuous grip on the highest-end offerings with its GeForce GTX 280 GPU, this lead may soon slip, depending on the performance of AMD's dual-GPU 4870 X2 (R700) card, likely coming in Q3 2008.  Meanwhile, NVIDIA faces challenges from Intel in its low-end and laptop graphics offerings, and from AMD's Puma chipset/graphics platform in the laptop market.

The economic repercussions of NVIDIA's slippage are already visible.  NVIDIA announced yesterday that it expects revenue of $875 million to $950 million for Q2 2008, which ends July 27.  This is significantly below current analyst expectations of $1.1 billion.

That was not the end of the bad news from NVIDIA, either.  The company announced that it is facing a massive recall due to overheating GPUs in notebook computers, reporting higher-than-average failure rates in both its laptop GPUs and its laptop chipsets.

NVIDIA said that the chips and their packaging were made with materials that proved too "weak."  It also passes part of the blame to notebook manufacturers, whose designs it says contribute to the problem: notebooks typically have poorer ventilation and components packed into a smaller space than desktop computers.

As a result of the recall, NVIDIA will take a one-time charge of $150 million to $200 million to cover the damages.  It plans to use the money to repair or replace defective parts, and it hopes to recover part of the cost from its insurers.  The company has acknowledged the problem and switched the materials it uses.

The news has resulted in NVIDIA taking a beating on the stock market, with shares sliding more than 25 percent.



Comments

RE: Ow
By flipmode on 7/3/2008 1:11:14 PM , Rating: -1
Why don't people know the definition of monolithic?

From dictionary.com:
consisting of one piece; solid or unbroken

RV770 is monolithic - it's one piece of silicon. That's all I'm saying.


RE: Ow
By JasonMick (blog) on 7/3/2008 1:15:30 PM , Rating: 4
Umm...
From dictionary.com
mon·o·lith·ic
...
5. characterized by massiveness, total uniformity, rigidity, invulnerability, etc.

I guess you could call the RV770 monolithic in the sense that it's one piece, but the word is more frequently used to mean "big" these days. By definition 5, the GT200 is more monolithic than the RV770.

Vocabulary is often like that... like how "pert" could mean healthy, but it's more frequently used to describe an impertinent (rude/saucy) person. ;)


RE: Ow
By flipmode on 7/3/08, Rating: -1
RE: Ow
By Mitch101 on 7/3/2008 2:16:55 PM , Rating: 2
Jefe, would you say I have a monolith of a GPU?

http://www.imdb.com/title/tt0092086/quotes


RE: Ow
By Bruneauinfo on 7/3/2008 3:12:51 PM , Rating: 2
LMAO!!

And while we're at it, let's get caught up in some semantics.

apparently, ATI is lucky they use good materials.

and nVidia probably won't be buying AMD anytime soon.


RE: Ow
By mathew7 on 7/4/2008 1:51:44 AM , Rating: 2
In your quote, the missing part is "high/top performance." So for high/top-performance cards, the single-chip solution is over. Basically, what they are saying is that they produce a mainstream chip and combine more of them for high performance. But we all know SLI/Crossfire does not double performance; the overhead of managing more chips kills it (at least in current titles).
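A minimal sketch of why that overhead keeps two GPUs from doubling frame rates, using an Amdahl-style model in Python. The parallel fraction and per-chip overhead figures below are invented for illustration, not measured values for any real card:

def effective_speedup(num_gpus: int,
                      parallel_fraction: float = 0.85,
                      per_gpu_overhead: float = 0.05) -> float:
    """Estimate speedup of an SLI/CrossFire-style setup over one GPU."""
    serial = 1.0 - parallel_fraction               # work that cannot be split
    parallel = parallel_fraction / num_gpus        # work divided across chips
    overhead = per_gpu_overhead * (num_gpus - 1)   # sync/management cost
    return 1.0 / (serial + parallel + overhead)

for n in (1, 2, 4):
    print(f"{n} GPU(s): ~{effective_speedup(n):.2f}x")

With these assumptions, two GPUs land near 1.6x rather than 2.0x, and a fourth chip adds very little, which matches the diminishing returns the comment describes.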


RE: Ow
By afkrotch on 7/7/2008 5:32:01 AM , Rating: 2
Let's not forget the need for driver updates that provide profile setup for games. This is where I find the biggest flaw in multiple-gpu setups.

If your game doesn't get a profile, you won't get the most performance out of the cards. AMD recently put in the profiles for BioShock and The Witcher. Well, I stopped playing BioShock about 2-3 weeks after it released, and I won't touch The Witcher with a ten-foot pole.

This is the whole reason I haven't bothered with a multi-GPU setup. Both strategies, AMD's and NVIDIA's, have their merits; I'm just with NVIDIA on this one. I prefer having a single GPU.

Less hassle for me to watercool, fewer cables, no wasted slot, no need for some 8000W PSU, and so on.
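To make the profile complaint concrete, here is a toy illustration (not actual driver code) of how per-game profile lookup works: the driver keys its multi-GPU behavior off the game executable's name, and anything without an entry falls back to a conservative default. All names and settings here are hypothetical:

GAME_PROFILES = {
    "bioshock.exe": {"mode": "alternate_frame_rendering", "scaling": 1.8},
    "witcher.exe":  {"mode": "split_frame_rendering",     "scaling": 1.5},
}

DEFAULT_PROFILE = {"mode": "single_gpu", "scaling": 1.0}  # safe fallback

def profile_for(executable: str) -> dict:
    """Return the multi-GPU profile for a game, or the safe default."""
    return GAME_PROFILES.get(executable.lower(), DEFAULT_PROFILE)

print(profile_for("Bioshock.exe"))  # profiled: near-1.8x scaling
print(profile_for("crysis.exe"))    # unprofiled: second GPU sits idle

The design choice this sketches is exactly the poster's gripe: an unprofiled game silently gets single-GPU behavior until the vendor ships an update.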


RE: Ow
By TomZ on 7/3/2008 2:05:13 PM , Rating: 3
quote:
5. characterized by massiveness, total uniformity, rigidity, invulnerability, etc.

A "tech" site should stick with the "tech" definition, which is the one the OP references. Technicians and engineers in the field don't just mean massive, etc. when they say "monolithic." Your use is more of a layperson.


RE: Ow
By MamiyaOtaru on 7/4/2008 1:08:11 PM , Rating: 2
I'd say pert is more often used to describe jubblies


RE: Ow
By TheJian on 7/3/08, Rating: -1
RE: Ow
By flipmode on 7/3/2008 1:55:23 PM , Rating: 2
Whom are you responding to?


RE: Ow
By JasonMick (blog) on 7/3/2008 2:53:53 PM , Rating: 4
Someone had a busy day at the nvidia koolaid stand I think...


RE: Ow
By TheJian on 7/7/2008 2:57:09 PM , Rating: 1
My reply was to you (sorry, flipmode). No koolaid involved. NVIDIA already dropped its GTX 280 to $459 at Newegg (a couple of cards after rebate, and quite a few under $500), and the GTX 260 is now $329 there. So what I said has already happened: they'll cut prices to make AMD's cards worth less, and they just did. You can expect more cuts the second the 4870 X2 comes out. By then the 260/280 will have a die shrink just about out the door to easily allow this and add performance.

I'm not saying I LIKE NVIDIA; I'm saying this is what's going to happen. Currently I'd buy a 4850/4870/GTX 260 (toss-up between the 4870 and GTX 260). But I'd also say ATI won't look quite so good after another cut from NVIDIA. $460 isn't bad for king of the hill; that's $140 less than quoted in all these reviews of it. Quite a price cut in ONE month, eh? The GTX 280 doesn't look so bad now. Remember that we used to pay $499/$599 for top-of-the-hill performance; right now that's only $460, which is suddenly a great buy.

My point is that AMD has a great pair of cards, but their performance per buck was only great when the GTX 260 was $450 and the GTX 280 was $600. At $329 and $460, things change. The reviews should be updated to show this, since it happened so fast.
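A quick worked version of that price/performance argument in Python. The prices come from the comment above; the frame-rate numbers are hypothetical stand-ins used only to show how the value ranking shifts when prices move:

cards = {
    # name: (hypothetical avg fps, launch price, cut price)
    "GTX 280": (100, 600, 460),
    "GTX 260": (85, 450, 329),
    "HD 4870": (80, 300, 300),
}

for name, (fps, old_price, new_price) in cards.items():
    print(f"{name}: {fps / old_price:.3f} fps/$ at launch, "
          f"{fps / new_price:.3f} fps/$ after cuts")

With these made-up frame rates, the GTX 260 goes from roughly 0.19 to 0.26 fps per dollar, closing most of the value gap the launch reviews measured against the 4870's 0.27.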


RE: Ow
By carl0ski on 7/4/2008 12:25:00 AM , Rating: 2
quote:
Why don't people know the definition of monolithic? From dictionary.com: consisting of one piece; solid or unbroken RV770 is monolithic - it's one piece of silicon. That's all I'm saying.


Arguably, AMD's Barcelona quad-core and Intel's Core 2 Duo are monolithic, and IBM's Cell is monolithic.

However, they are modular monolithic designs: parts can be disabled without rendering the entire device inoperable, and sections may also be left out before the construction phase, i.e., Cell processors with 3, 6, or 9 cores are available.

http://www.anandtech.com/video/showdoc.aspx?i=3341...
A SIMD core is very similar to NVIDIA's SM with a couple of exceptions:

1) There are more SPs in AMD's SIMD Core (16 vs 8)


In theory, AMD could completely remove 4, 8, or 12 of those SIMD cores to make the device smaller, cheaper, and less power-hungry.

Or, better, dynamically disable unused ones to conserve power (see the sketch below).
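A minimal sketch of the salvage/binning idea this comment describes: a die with N SIMD cores can ship with some cores fused off, trading throughput for power and yield. The core counts and wattages below are invented for illustration, not RV770 specifications:

FULL_SIMD_CORES = 10     # hypothetical full part
WATTS_PER_CORE = 12.0    # hypothetical per-core power
BASE_WATTS = 40.0        # hypothetical uncore/memory power

def bin_part(enabled_cores: int) -> dict:
    """Describe a salvaged part with some SIMD cores disabled."""
    assert 0 < enabled_cores <= FULL_SIMD_CORES
    return {
        "cores": enabled_cores,
        "relative_throughput": enabled_cores / FULL_SIMD_CORES,
        "board_power_watts": BASE_WATTS + WATTS_PER_CORE * enabled_cores,
    }

for cores in (10, 8, 6):  # full part plus two salvage bins
    print(bin_part(cores))

The same table-driven idea covers both cases the comment raises: fusing cores off permanently at binning time, or gating them dynamically to save power at runtime.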


RE: Ow
By Clauzii on 7/9/2008 9:55:52 PM , Rating: 2
Regarding the CBE: the PS3 has 7 working Cell units, of which 6 are under user control, so there are 8 cores in total :) And (you probably know) one core is NOT an SPE but a PPC core acting more like the master control.


"It's okay. The scenarios aren't that clear. But it's good looking. [Steve Jobs] does good design, and [the iPad] is absolutely a good example of that." -- Bill Gates on the Apple iPad














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki