
NVIDIA is sure to come out fighting

ATI is one of the biggest players in the discrete and integrated GPU markets, alongside rival NVIDIA. The two firms fight bitterly for dominance in the market and for the crown of fastest video card.

AMD has taken heat from analysts and investors since the purchase of ATI because the graphics unit has, for the most part, performed poorly. At one point in 2008, AMD was forced to take an $880 million write-down related to ATI. AMD executives are feeling smug today with the announcement that ATI has taken the top spot in the discrete GPU market from NVIDIA for the first time since AMD purchased the company.

Jon Peddie Research (JPR) reports that overall graphics card shipments for Q1 2010 were up 4%. However, shipments of video cards in the discrete desktop graphics market slipped 21.4%, thanks in part to the massive growth of notebook sales. Overall shipments for the market were 38.6% above the same period last year.

AMD posted the biggest gains in company history in both the discrete and integrated desktop product markets. NVIDIA, at the same time, posted shipment losses in every category except its integrated notebook GPU business, which grew 10%.

AMD held 24.4% of the GPU market for the quarter, Intel held 54.9%, and NVIDIA held 19.7%. AMD reported that revenue in its graphics segment grew 8% from the previous quarter and 87% from the same quarter last year, coming in at $440 million.

AMD claims that it holds 51.1% of the discrete graphics market, with NVIDIA holding 44.5%.





Comments

NVIDIA is sure to come out fighting
By Amiga500 on 7/30/2010 9:39:24 AM , Rating: 1
Is it?

Long term, what is the future for Nvidia?
- Discrete GPUs? That is an ever-shrinking market... soon APUs will integrate everything up to performance-class graphics on the "CPU" die.
- Co-processors? Niche market, at best
- Workstation-only cards? We saw how that worked out for Matrox, now a marginalised company on the GPU periphery.
- Low cost embedded? Eventually ARM will produce their own graphics extensions.

For some reason, AMD has a similar market cap to Nvidia. Long term, only one company has a clear road ahead.




RE: NVIDIA is sure to come out fighting
By LordSojar on 7/30/2010 9:51:13 AM , Rating: 2
quote:
Low cost embedded? Eventually ARM will produce their own graphics extensions.


ARM will what? Uh... ARM is an architecture, just as x86 is. The company ARM Holdings will NEVER go into that as a business venture. That isn't a logical move for them. I say that with 100% confidence.

nVidia's future lies in technologies like ION and Tegra (specifically Tegra). Tegra 2 and Tegra APX have proved that nVidia can produce ultra-mobile solutions that drastically outperform the competition.

Let's look at who competes with nVidia in this regard... Qualcomm? Don't be silly, Snapdragon's GPU is a pathetic joke in the tech world.

Samsung with their Hummingbird CPU, which, while more impressive than the dismal Snapdragon platform, still leaves much to be desired. And do keep in mind, Hummingbird is almost identical to Apple's A4 SoC; just covering that as a base.

What does that leave? AMD's Bobcat? Yes, for ultra-mobile netbooks and tablets it will be an excellent solution, assuming they pull their Fusion program off correctly. For mobile phones, it's a different story. Bobcat still draws too much wattage and voltage for use in mobile phones. Tegra 2 is honestly a bit too high for that as well, which is where the APX comes in.

nVidia is actually ahead of the curve in these regards; their products are too good for the current level of demand consumers put on products. With a die shrink, Tegra 2 could easily be put into an HTC Android handset, and that would absolutely destroy anything available on the market today or announced for an early 2011 release. By destroy... I mean chew up and spit out like a poorly cooked steak.


RE: NVIDIA is sure to come out fighting
By Amiga500 on 7/30/10, Rating: 0
RE: NVIDIA is sure to come out fighting
By StevoLincolnite on 7/30/2010 12:05:57 PM , Rating: 3
quote:
Yet I look at Fusion and what do I see? an x86 CPU with graphics extensions...


You make it sound as if combining a GPU and a CPU that share resources and pipelines will spell the death of the GPU. Well, news flash!

GPUs use insane amounts of power these days; combine a processor and a Fermi and that's asking for trouble.

Then look at how large the die could be, which will also hurt yields, driving up prices.

Then you will come to the realisation that you won't be able to have a high-end GPU built into a CPU that shares pipelines and resources; it's simply not feasible.

Then what about memory? I can't imagine them using system memory, and a 256-bit or wider memory bus would add extra layers to the motherboard's PCB because of the added complexity.

Nope, "Fusion" styled architectures are basically just a replacement for the IGP, they will only recieve better performance because the floating point instructions will be handled by the GPU's pipelines on the die which is very good at handling such instructions.

Basically, you still won't be running Crysis with Eyefinity, all maxed out, on a Fusion-styled CPU.
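
To make the "floating-point work handled by the GPU pipelines" idea above concrete, here is a minimal, illustrative OpenCL sketch; it is not tied to Fusion or to any specific part. The host copies two float arrays to the GPU, runs a simple saxpy kernel, and reads the result back. It assumes an OpenCL 1.x runtime with a GPU device is installed, omits all error checking, and the kernel name and array size are arbitrary choices for the example.

/* saxpy.c - offload y = a*x + y to a GPU via OpenCL (illustrative sketch) */
#include <stdio.h>
#include <CL/cl.h>

static const char *src =
"__kernel void saxpy(__global float *y, __global const float *x, float a) {\n"
"    size_t i = get_global_id(0);   /* each work-item owns one element */\n"
"    y[i] = a * x[i] + y[i];\n"
"}\n";

int main(void) {
    enum { N = 1024 };
    float x[N], y[N], a = 3.0f;
    for (int i = 0; i < N; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    /* pick the first GPU device the runtime reports */
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* copy the input arrays into device buffers */
    cl_mem bx = clCreateBuffer(ctx, CL_MEM_READ_ONLY  | CL_MEM_COPY_HOST_PTR, sizeof x, x, NULL);
    cl_mem by = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR, sizeof y, y, NULL);

    /* build the kernel and bind its arguments */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "saxpy", NULL);
    clSetKernelArg(k, 0, sizeof(cl_mem), &by);
    clSetKernelArg(k, 1, sizeof(cl_mem), &bx);
    clSetKernelArg(k, 2, sizeof(float), &a);

    /* launch N work-items (the GPU's FP units do the arithmetic), then read back */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, by, CL_TRUE, 0, sizeof y, y, 0, NULL, NULL);

    printf("y[0] = %.1f (expect 5.0)\n", y[0]);
    return 0;
}

On an APU-style design the trip across the PCIe bus largely disappears, since the CPU and GPU sit on the same die behind the same memory controller, which is part of the efficiency argument made above.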


RE: NVIDIA is sure to come out fighting
By MGSsancho on 7/30/2010 3:12:33 PM , Rating: 2
AMD never said Fusion was a replacement for high-end discrete solutions. Also, GPUs take up insane amounts of power because they currently have up to 1600 ALUs and high clock frequencies. I am sure AMD will do something similar to Nvidia's approach by having the GPU part clocked higher than the CPU part on the same die, as well as putting some of the GPU parts (ALUs, pipeline, cache, etc.) where it makes sense. Both companies have a lot of smart people. Let's wait till their products ship before we pass too much judgment.


By muhahaaha on 7/30/2010 5:20:51 PM , Rating: 2
Not sure how I missed your reply, but you took the words right out of my mouth.


RE: NVIDIA is sure to come out fighting
By muhahaaha on 7/30/2010 5:17:54 PM , Rating: 2
I prefer to save my judgment on Fusion for when it is actually out. AMD has some very bright people working for them and I wouldn't be surprised if they come out with a revolutionary new product. At worst, a Fusion APU will be light-years ahead of Intel's IGPs.


RE: NVIDIA is sure to come out fighting
By PrinceGaz on 7/30/2010 7:57:06 PM , Rating: 2
Whilst I hope that Bulldozer is as revolutionary as Clawhammer was, I can't see it beating Intel in the CPU department (Intel haven't gone down the wrong road of high clock speeds this time), though it should absolutely blow away any Intel graphics solution. Unless Intel pull a real joker out of the pack and come up with a totally unexpected, revolutionary Extreme Graphics III which crushes anything AMD or nVidia have.


By bigboxes on 7/31/2010 12:46:35 AM , Rating: 1
Whilst? Seriously, stop with the ghey. Speak English and not some tongue of yore.


RE: NVIDIA is sure to come out fighting
By jconan on 7/31/2010 3:55:48 AM , Rating: 2
If AMD is capable of harnessing the GPU's processing power in the APU, with its 1600 ALUs, and of course having them process x86 code, Intel will hardly be a match for AMD.


By JKflipflop98 on 8/1/2010 1:09:38 AM , Rating: 2
Yes, and IF frogs had wings they wouldn't bump their ass when they hop.


RE: NVIDIA is sure to come out fighting
By Spoelie on 7/30/2010 11:52:38 AM , Rating: 3
You're comparing an architecture to SoCs.

You essentially have 3 mobile GPU players:
* Qualcomm (Adreno)
* PowerVR (SGX)
* NVIDIA (Tegra)

While Adreno is surely the worst of the bunch, I haven't seen any evidence that Tegra is substantially better than SGX. Any hard numbers to back up your claims?


RE: NVIDIA is sure to come out fighting
By Ushio01 on 7/30/2010 4:12:21 PM , Rating: 3
RE: NVIDIA is sure to come out fighting
By ekv on 7/31/2010 2:14:44 AM , Rating: 2
My "wrong arm"? I suppose my other arm is developing punctuation if not an ARMv8-AE ...


By nafhan on 7/30/2010 11:22:14 AM , Rating: 2
I think the future for Nvidia will be developing workstation and co-processor card technology for heavy lifting and then migrating that tech into highly integrated, probably ARM-based SoCs along the lines of Tegra. They may be completely cut out of the middle ground unless they are able to remain a player in console systems.
If ARM does come up with their own graphics extensions, I think it'll be quite a while before those hypothetical extensions and the implementations of them are up to the quality of ION and its descendants, and even then ARM typically allows custom SoC designs.


By geddarkstorm on 7/30/2010 1:34:23 PM , Rating: 2
I'm just going to touch on one point as no one else has yet.

There is no way an APU can ever rival the power of a discrete graphics card; it isn't physically possible. Look at the size of a graphics card, all that's on it, all that goes into it, and then look at your CPU and mobo. Can you see the two merged? The heat issue alone means -no-.

The APUs are the lowest end integrated graphics. Yes, they are better than motherboard integrated graphics chips in -some- cases (looking at you, Intel), but that's because integrated graphics are just that abysmally horrible to begin with.

The shift from desktops toward portable devices is the main reason the dedicated graphics card market didn't grow as much as expected. It's also tied to PC gaming and whatever happens with that, and all the card confusion and saturation that both Nvidia and AMD have put out there doesn't help. But APUs are nothing the discrete market will ever have to worry about, though they make the average consumer's life a lot easier. And hey, if you can SLI/Crossfire/Hydra an APU with a discrete card, that'd be pretty dang sweet.


More bad news this week
By Taft12 on 7/30/2010 10:52:52 AM , Rating: 2
This is going to get worse before it gets better - Nvidia just lost the iMac business to ATI.

Nvidia is not doomed, but it's not a good time to be buying Nvidia stock.




RE: More bad news this week
By LordSojar on 7/30/2010 11:07:21 AM , Rating: 1
quote:
This is going to get worse before it gets better - Nvidia just lost the iMac business to ATI. Nvidia is not doomed, but it's not a good time to be buying Nvidia stock.


Right... because Apple and their massive desktop market share will clearly make a... oh wait... what market share? This move doesn't really hurt nVidia at all.

The only semi-popular Mac product is the MacBook, which is still powered by nVidia's mobile solutions. In that regard, ATi's mobile solutions are actually fairly lackluster. They consume too much power to be viable in any sort of portability sense. Their discrete GPUs for desktops are amazing, as are their desktop IGP solutions, but... their laptop and netbook solutions are subpar. Yes, the Mobility 5850 and 5870 are great performers, but they are also battery vampires. Who buys notebooks for gaming?

Think about what you type before you hit submit, people. Saying this and that is gloom and doom for company X or Y is nonsense. ATi is doing well, and nVidia is fine as well. Both are gaining and losing ground in different areas.

nVidia has the contract to produce all the graphics chips for the Nintendo 3DS. ATi got the contract for the iMac lineup mid-year refresh. You win some, you lose some. The important thing is that both companies continue to thrive and compete against each other, driving performance per watt up, and costs down. Never forget that.

Oh, and feel free to down-rate this comment too, since clearly, rather than looking up performance-per-watt values for ATi's current Mobility 5xxx solutions versus nVidia's, you will just assume I'm wrong. Bets on how long it takes those who blindly praise ATi and their products to rate this down to a 0 or -1? I'd give it 20 minutes.


RE: More bad news this week
By spread on 7/30/2010 11:25:42 AM , Rating: 2
Depends which ATi products you mean. Yes, the mobile HD5850 is a power hog, but compared to its nVidia counterparts it's pretty much the same. The only good nVidia mobile solution (power-wise) is their Optimus tech.

There's lots of people who buy gaming notebooks as portable desktop replacements. They're also good for doing design work.


RE: More bad news this week
By JKflipflop98 on 8/1/2010 1:12:21 AM , Rating: 2
Discounting 10% of the market is a stupid move.


RE: More bad news this week
By spread on 7/30/2010 11:21:45 AM , Rating: 2
And with the recent Rambus litigation... you can still make money by shorting nVidia stock.


RE: More bad news this week
By LordSojar on 7/30/2010 11:29:07 AM , Rating: 1
quote:
And with the recent Rambus litigation... you can still make money by shorting nVidia stock.


I respect Rambus for their technologies, especially their memory bus designs. But... what they are trying to pull with nVidia is insane. Our (the US) patent system is so terrible...

In this instance, I really hope nVidia's appeal succeeds and the injunction is nullified. The same would apply to any company being sued under these terms, whether Intel, AMD, or otherwise. Shame on Rambus.


RE: More bad news this week
By spread on 7/30/2010 3:39:33 PM , Rating: 2
Like all corporations, Rambus is simply trying to take as much advantage as possible. And they can because the patent system is a joke.


Late news again!?
By roostitup on 7/30/10, Rating: 0
RE: Late news again!?
By roostitup on 7/30/10, Rating: 0
RE: Late news again!?
By twhittet on 7/30/2010 10:28:46 AM , Rating: 3
Isn't 90% of the stuff on here just re-written from yesterday? I'm ok with that, since I don't have the time or energy to search every other site for individual articles.

Though I admit a little more quality in some of their re-writes would be helpful.....


RE: Late news again!?
By LordSojar on 7/30/10, Rating: -1
RE: Late news again!?
By LordSojar on 7/30/10, Rating: -1
RE: Late news again!?
By nafhan on 7/30/2010 10:58:42 AM , Rating: 4
[PUBLIC SERVICE ANNOUNCEMENT]

If you make a comment completely unrelated to the text of the article, your post will probably be considered a bad comment and may get rated down.


RE: Late news again!?
By JakLee on 7/30/2010 6:20:04 PM , Rating: 2
unless it's humorous, has a geek reference or an acceptable pop-culture reference.

And I for one look forward to our red ATI Overlords.


The future looks bleak for Nvidia
By Gungel on 7/30/2010 9:43:43 AM , Rating: 2
ATI will probably continue to increase its share of discrete graphics cards. Nvidia hasn't even released a full line of Fermi-based cards yet, and ATI is already sampling its next-generation Radeon HD 6000 series (Southern Islands) GPUs. The Fermi delay might be too much for Nvidia to handle. However, I hope they stay around a while longer to compete with ATI.




By just4U on 8/26/2010 2:19:23 AM , Rating: 2
Mmm... I don't know about that. Nvidia's 460 is a killer little card. They executed its launch perfectly and it's flying off shelves because of it. It shows that Nvidia has a fair amount of fight still left in them.

Personally I think the two companies will be fighting it out for years to come.


NVIDIA's long-term strategy
By HoosierEngineer5 on 7/30/2010 10:44:15 AM , Rating: 3
may be to take over the compute market. In raw horsepower, their graphics processors smoke general purpose CPUs. Now, if they can only get software engineers to think in parallel...

Intel has never been a leader, architecturally speaking. They go for market share by being conservative in their designs, and having an advanced manufacturing capability.

NVIDIA may yet catch them with their pants down (AMD did it once or twice, but once you kick the sleeping dragon...).

In the long run, the industry will benefit from the competition. It has been demonstrated repeatedly that Intel must not be a monopoly, or technological advancement will stall. They are not always 'paranoid' enough.
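
The "think in parallel" shift mentioned above is easiest to see side by side. Below is a rough, illustrative sketch (the function and parameter names are invented for the example): the same scaling operation written first as an ordinary serial C loop, then as an OpenCL kernel, where the loop disappears and the runtime instead launches one lightweight work-item per element across the GPU's many ALUs.

/* Serial version: one CPU core walks the array element by element. */
void scale_cpu(float *data, float gain, int n) {
    for (int i = 0; i < n; ++i)
        data[i] *= gain;
}

/* Data-parallel version (OpenCL C, compiled for the GPU at runtime):
   no loop - each work-item computes exactly one element, and the
   hardware schedules thousands of them concurrently. */
__kernel void scale_gpu(__global float *data, float gain) {
    size_t i = get_global_id(0);
    data[i] *= gain;
}

The hard part the comment alludes to isn't the syntax; it's restructuring real algorithms so that thousands of such independent work-items exist in the first place.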




By JKflipflop98 on 8/1/2010 1:18:12 AM , Rating: 1
LOL wut? Intel's design has almost always been superior to the competition's. The only time that wasn't true was for the 3 or 4 years when the Athlon64 was king. But those days are long gone.


/Runs...
By unplug on 7/31/2010 1:08:50 AM , Rating: 2
back to Bad Company 2 in Eyefinity...

/giggles




If only...
By LordSojar on 7/30/10, Rating: -1
RE: If only...
By Amiga500 on 7/30/2010 9:45:19 AM , Rating: 1
You "sir" are an obvious Nvidia plant, or retarded. Take your pick.

Within the same paragraph you say:

quote:
That's unacceptable. I bought a $425 dollar GPU, which was terribly overpriced.


quote:
Give me a break... if something is overpriced, then don't buy it.


The 5870 was not overpriced at all at launch. Indeed, considering it's still effectively king of the hill at the exact same price, it was exceedingly good value at launch.


RE: If only...
By Drag0nFire on 7/30/2010 10:03:43 AM , Rating: 2
Agreed. Looks like an Nvidia fanboy.

But his comments on the drivers are valid. When the 5xxx series came out, I was getting all sorts of glitching, blank screens, and reboots. There's no excuse in my book for selling a card with half-finished drivers.

Also, when I "upgraded" from a 4850 to a 5770, I noticed some games got slower. Particularly GTA San Andreas, which slowed to a crawl. Given that the 5770 should be faster in every way, I have no explanation for this other than poorly optimized drivers...


RE: If only...
By BladeVenom on 7/30/2010 5:29:01 PM , Rating: 3
Microsoft's crash reports have shown that Nvidia makes the least reliable drivers; most Vista crashes were caused by Nvidia drivers. http://www.dailytech.com/NVIDIA+Drivers+Caused+Lio...


RE: If only...
By jconan on 7/31/2010 4:01:51 AM , Rating: 2
That's old news, and the drivers were beta, not gold or WHQL-certified; that's why Vista crashed.


RE: If only...
By LordSojar on 7/30/10, Rating: 0
RE: If only...
By SlyNine on 7/31/2010 8:15:50 PM , Rating: 2
You're sporting a bit of a strawman yourself. Very few 5xxx cards have problems, and you cannot say for sure the problem is squarely on the card.

Meanwhile, I own 3 different 5xxx parts and none of them have problems. You say look into it. Why should I have to prove your premise? You prove that a lot of people are having this corruption problem and that it's their VIDEO CARD'S fault.

I know I had 4 different 512MB 8800GTs (bought 2, both had to be replaced) and ALL of them overheated and crashed; I had to put an aftermarket HSF on them to fix it.

As far as the 5870 being overpriced, you're making the claim that it is, so you should be the one defending that claim rather than asking me to prove the negative. Show me the "fab production costs, be they measured per chip or per spin". BTW, I agree that it is currently overpriced, but then again the real price of something is always what people are willing to pay for it.

I'll take a single card over SLI any day. SLI scaling may be good, but until they use a unified frame buffer, I say no thanks. Been down that road and didn't care for it.


RE: If only...
By Gungel on 7/30/2010 9:50:17 AM , Rating: 3
Unfortunately for Nvidia they make no money with Fermi. ATI has the upper hand when it comes to profit margin on the GPU side.


RE: If only...
By LordSojar on 7/30/2010 10:24:29 AM , Rating: 1
quote:
Unfortunately for Nvidia they make no money with Fermi. ATI has the upper hand when it comes to profit margin on the GPU side.


First part, untrue. Second part, true.

nVidia absolutely makes money on each Fermi chip sold, and it would be foolish to believe otherwise. When these supposed "tech" sites make estimates, they base them on die size alone (a metric tied to older fab and bin rates). Fab pricing changes dramatically based on the doping, and Fermi's die doping is cost-effective, rest assured.

ATi's Cypress is also very, very cost-effective, which is why your second point (among other reasons) is correct. More specifically, the way ATi engineered Cypress and Redwood and the way TSMC fabricates the pMOS of their dies lends a significant bin improvement to ATi's chips.

Unfortunately, nVidia didn't listen to advice given to them early in the development of GF100 and chose a slightly cheaper path, which ended up costing them dearly. That's what you get when you ignore the people devoted to analyzing the EM physics of the fab process and comparing it with the proposed chip design when they tell you the layout won't work as intended.


RE: If only...
By twhittet on 7/30/2010 10:34:01 AM , Rating: 2
The "making money on each Fermi chip" is probably still debatable, depending on how how you do the math. They wasted a ton of R&D time and money creating Fermi, and are losing marketshare. Sounds like a losing situation to me.


RE: If only...
By LordSojar on 7/30/10, Rating: 0
RE: If only...
By adl on 7/30/2010 11:48:52 AM , Rating: 1
i've read all your previous comments, and i'm quite impressed. your opinion is well balanced ... which is pretty unusual for this site :p.

in fact, i'm impressed enough to ask you the following :) :

what would you consider to be a good single graphics card solution (primarily for gaming ... i'll be honest - my coding days are behind me. i am a bit curious about CUDA though ... even though i've got an ati bias) for say a 1920 x 1200 display? the rest of my rig isn't too high end (i'm running an E7500 with 4 gigs of RAM), but it is good enough.

i'd also be happy to have good linux support ... but wouldn't we all :p.

oh, and about the article ... i tend to prefer statistics like the steam hardware survey.

http://store.steampowered.com/hwsurvey/

i think it's a far better representation of the current market for gaming hardware.


RE: If only...
By LordSojar on 7/30/2010 12:16:29 PM , Rating: 1
quote:
what would you consider to be a good single graphics card solution (primarily for gaming ... i'll be honest - my coding days are behind me. i am a bit curious about CUDA though ... even though i've got an ati bias) for say a 1920 x 1200 display? the rest of my rig isn't too high end (i'm running an E7500 with 4 gigs of RAM), but it is good enough. i'd also be happy to have good linux support ... but wouldn't we all :p.


Honestly? A 1GB GTX 460 overclocked is an unbeatable deal at this point in time, at least until ATi lowers the prices on the HD5850 and 5870. If you have more money to blow, then the GTX470 or HD5870 become open options.

Given that your CPU will be your bottleneck in the majority of games (excluding your Crysis or Metro-esque titles), I'd say go midrange and save money (i.e. the HD5850 or GTX 460, and honestly I can't recommend a 5850 at its current price compared to the 1GB GTX 460). If you can snag an HD5850 for $250 or less, go for it. Otherwise, spend the $220-230 on the 1GB GTX 460 and overclock that thing to the moon. It will beat the stock HD5850 in nearly every game you can throw at it, especially when you turn up the anti-aliasing.

Linux support is basically nonexistent for ATi GPUs. nVidia's Linux support isn't exactly superb, but it's not terrible either. I run Ubuntu and Gentoo 64-bit installs on a workstation using nVidia cards. My primary desktop (this one) runs only Win7, as I attempted an Ubuntu install with this 5870... it didn't end well.


RE: If only...
By adl on 7/30/2010 1:52:03 PM , Rating: 2
hmm ...

the CUDA support is a bonus i wouldn't mind playing around a bit with ... and the prices for nvidia cards are always lower than their amd equivalents here in india anyways.

and as for the cpu bottleneck ... the (very few) games i'm interested in aren't heavily threaded anyway.

thanks for the help LordSojar ... i'll definitely consider your advice while buying.


"I'm an Internet expert too. It's all right to wire the industrial zone only, but there are many problems if other regions of the North are wired." -- North Korean Supreme Commander Kim Jong-il
Related Articles













botimage
Copyright 2015 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki