


NVIDIA is sure to come out fighting

ATI is one of the biggest players in the discrete and integrated GPU market, alongside rival NVIDIA. The two firms fight bitterly for market dominance and for the crown of fastest video card.

AMD has taken heat from analysts and investors since the purchase of ATI because the graphics unit has performed poorly for the most part. At one point in 2008, AMD was forced to take an $880 million write-down related to ATI. AMD executives are feeling smug today with the announcement that ATI has taken the top spot in the discrete GPU market from NVIDIA for the first time since AMD purchased the company.

Jon Peddie Research (JPR) reports that overall graphics card shipments for Q1 2010 were up 4%. However, shipments in the discrete desktop graphics segment slipped 21.4%, thanks in part to the massive growth of notebook sales. JPR reports that overall shipments for the market are 38.6% above the same period last year.

AMD posted the biggest gains in company history in both the discrete and integrated desktop products markets. NVIDIA, at the same time, posted shipment losses in all categories except its integrated notebook GPU business, which grew 10%.

AMD had 24.4% of the GPU market for the quarter, Intel held 54.9%, and NVIDIA held 19.7%. AMD reported that its graphics segment revenue of $440 million grew 8% from the previous quarter and 87% compared to the same quarter of last year.
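As a quick sanity check on those growth figures, the prior-quarter and year-ago revenue can be back-calculated from the $440 million figure (a rough arithmetic sketch; the derived dollar values are estimates, not reported numbers):

```python
# Back-calculate AMD's earlier graphics-segment revenue from the
# growth rates quoted above. Derived values are approximations.
current = 440e6          # Q1 2010 graphics revenue, USD
qoq_growth = 0.08        # +8% vs. the previous quarter
yoy_growth = 0.87        # +87% vs. the same quarter last year

prev_quarter = current / (1 + qoq_growth)   # implied prior-quarter revenue
year_ago = current / (1 + yoy_growth)       # implied year-ago revenue

print(f"Previous quarter: ${prev_quarter / 1e6:.0f}M")  # ~$407M
print(f"Year-ago quarter: ${year_ago / 1e6:.0f}M")      # ~$235M
```

In other words, the segment added roughly $200 million in quarterly revenue year over year.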

AMD claims it holds 51.1% of the discrete graphics market, with NVIDIA holding 44.5%.



Comments



If only...
By LordSojar on 7/30/2010 9:41:17 AM , Rating: -1
ATi could make drivers that actually had more functionality outside of "GAME GAME GAME". People buy GPUs for more than gaming, and Stream is a sad excuse for a GPGPU solution. As a matter of fact, Stream is some kind of cruel joke that ATi came up with to make fun of nVidia's own CUDA technology... but the joke's on ATi, seeing as how Stream is a useless, inefficient and sloppy harpy compared to the angelic CUDA.

I own an HD5870, using it as we speak. I still have artifact issues with my Samsung LED TV connected via HDMI or DVI-to-HDMI... after 7 driver revisions; I'm not alone in these troubles. That's unacceptable. I bought a $425 GPU, which was terribly overpriced. Oh how things change with the ATi fanboys. It was always "boo hoo, NVIDIA is overpriced! QQ", but when ATi pulls the same garbage with their cards because they have a 7 month lead on nVidia, the ATi fans cry "ATi's pricing is fair! They need to make a profit!" Give me a break... if something is overpriced, then don't buy it. Fanboys (on both sides) should get a clue; both companies are in it for the money, and will tear your wallet to ribbons given the chance. They don't care about you.

ATi (AMD) seems to have a new curse; they make amazing hardware, but the drivers that should be equally amazing are the opposite... bloated piles of malfunctioning garbage that either fail to function or lack features that could be a marketable solution. Perhaps if ATi and nVidia fused into one company? Oh wait, no, that would be terrible, slash that suggestion from the ledger.

Granted, shame on nVidia for keeping CUDA specific to their own cards, as well as PhysX, but I digress; at least those two features are functional (though PhysX, as of this moment, is still more of a marketing term than a heavily used standard). CUDA is amazing, if you've had the chance to use it in any extensive fashion... absolutely amazing, especially with the Nexus functionality (aka Parallel Nsight, as it is now called).

ATi will lose a lot of that market share this quarter, thanks to nVidia getting a clue about midrange solutions. The GTX460, based on GF104, is brilliant. It's what Fermi should have been from the start: a superscalar, condensed, power efficient yet monolithic chip that has tons of power, priced aggressively toward a very marketable segment. Only took them, what... 2 years to find a cure to the stupid virus? Too bad ATi has already saturated the market with terrible cards like the HD5830 (I truly pity anyone stupid enough to purchase that disaster of engineering prowess).




RE: If only...
By Amiga500 on 7/30/2010 9:45:19 AM , Rating: 1
You "sir" are an obvious Nvidia plant, or retarded. Take your pick.

Within the same paragraph you say:

quote:
That's unacceptable. I bought a $425 dollar GPU, which was terribly overpriced.


quote:
Give me a break... if something is overpriced, then don't buy it.


The 5870 was not overpriced at all at launch. Indeed, considering it's still effectively king of the hill at the exact same price, it was exceedingly good value at launch.


RE: If only...
By Drag0nFire on 7/30/2010 10:03:43 AM , Rating: 2
Agreed. Looks like an Nvidia fanboy.

But his comments on the drivers are valid. When the 5xxx series came out, I was getting all sorts of glitching, blank screens, and reboots. There's no excuse in my book for selling a card with half-finished drivers.

Also, when I "upgraded" from a 4850 to a 5770, I noticed some games got slower. Particularly GTA San Andreas, which slowed to a crawl. Given that the 5770 should be faster in every way, I have no explanation for this other than poorly optimized drivers...


RE: If only...
By BladeVenom on 7/30/2010 5:29:01 PM , Rating: 3
Microsoft's crash reports have shown that Nvidia is the one who makes the least reliable drivers; most Vista crashes were caused by Nvidia drivers. http://www.dailytech.com/NVIDIA+Drivers+Caused+Lio...


RE: If only...
By jconan on 7/31/2010 4:01:51 AM , Rating: 2
That's old news, and the drivers were beta, not gold or WHQL certified; that's why Vista crashed.


RE: If only...
By LordSojar on 7/30/10, Rating: 0
RE: If only...
By SlyNine on 7/31/2010 8:15:50 PM , Rating: 2
You're sporting a bit of a strawman yourself. Very few 5xxx cards have problems, and you can't say for sure the problem lies squarely with the card.

Meanwhile, I own 3 different 5xxx parts and none of them have problems. You say to look into it; why should I have to prove your premise? You prove that a lot of people are having this corruption problem and that it's their video cards' fault.

I know I had 4 different 8800GT 512MB cards (bought 2; both had to be replaced), and ALL of them overheated and crashed. I had to put an aftermarket HSF on them to fix it.

As far as the 5870 being overpriced: you're the one making the claim, so you should be the one defending it instead of asking others to prove a negative. Show me the "fab production costs, be they measured per chip or per spin". BTW, I agree that it is currently overpriced, but then again the real price of something is always what people are willing to pay for it.

I'll take a single card over SLI any day. SLI scaling may be good, but until they use a unified frame buffer, I say no thanks. Been down that road and didn't care for it.


RE: If only...
By Gungel on 7/30/2010 9:50:17 AM , Rating: 3
Unfortunately for Nvidia, they make no money with Fermi. ATI has the upper hand when it comes to profit margin on the GPU side.


RE: If only...
By LordSojar on 7/30/2010 10:24:29 AM , Rating: 1
quote:
Unfortunately for Nvidia they make no money with Fermi. ATI has the upper hand when it comes to profit margin on the GPU side.


First part, untrue. Second part, true.

nVidia absolutely makes money on each Fermi chip sold, and it would be foolish to believe otherwise. When these supposed "tech" sites make estimates, they base them on die size alone (a metric based on older fab and bin rates). Fab pricing changes dramatically based on the doping, and Fermi's die doping is cost effective, rest assured.

ATi's Cypress is also very, very cost effective, which is why your second point (among other reasons) is correct. More specifically, the way that ATi engineered Cypress and Redwood, and the way TSMC fabricates the pMOS of their dies, lends a significant bin improvement to ATi's chips.

Unfortunately, nVidia didn't listen to advice given to them early in the development of GF100 and chose a slightly cheaper path, which ended up costing them dearly. That's what you get when you ignore a chunk of the people devoted to analyzing the EM physics of the fab and comparing it with the proposed chip design, all of them telling you that the layout won't work as intended.


RE: If only...
By twhittet on 7/30/2010 10:34:01 AM , Rating: 2
The "making money on each Fermi chip" claim is probably still debatable, depending on how you do the math. They wasted a ton of R&D time and money creating Fermi, and they're losing market share. Sounds like a losing situation to me.


RE: If only...
By LordSojar on 7/30/10, Rating: 0
RE: If only...
By adl on 7/30/2010 11:48:52 AM , Rating: 1
i've read all your previous comments, and i'm quite impressed. your opinion is well balanced ... which is pretty unusual for this site :p.

in fact, i'm impressed enough to ask you the following :) :

what would you consider to be a good single graphics card solution (primarily for gaming ... i'll be honest - my coding days are behind me. i am a bit curious about CUDA though ... even though i've got an ati bias) for say a 1920 x 1200 display? the rest of my rig isn't too high end (i'm running an E7500 with 4 gigs of RAM), but it is good enough.

i'd also be happy to have good linux support ... but wouldn't we all :p.

oh, and about the article ... i tend to prefer statistics like the steam hardware survey.

http://store.steampowered.com/hwsurvey/

i think it's a far better representation of the current market for gaming hardware.


RE: If only...
By LordSojar on 7/30/2010 12:16:29 PM , Rating: 1
quote:
what would you consider to be a good single graphics card solution (primarily for gaming ... i'll be honest - my coding days are behind me. i am a bit curious about CUDA though ... even though i've got an ati bias) for say a 1920 x 1200 display? the rest of my rig isn't too high end (i'm running an E7500 with 4 gigs of RAM), but it is good enough. i'd also be happy to have good linux support ... but wouldn't we all :p.


Honestly? A 1GB GTX 460 overclocked is an unbeatable deal at this point in time, at least until ATi lowers the prices on the HD5850 and 5870. If you have more money to blow, then the GTX470 or HD5870 become open options.

Given that your CPU will be your bottleneck in the majority of games (excluding your Crysis or Metro-esque titles), I'd say go midrange and save money (aka the HD5850 or GTX 460, and honestly I can't recommend a 5850 at its current price compared to the 1GB GTX 460). If you can snag an HD5850 for $250 or less, go for it. Otherwise, spend the $220-230 on the 1GB GTX 460 and overclock that thing to the moon. It will beat the stock HD5850 in nearly every game you can throw at it, especially when you turn up the anti-aliasing.

Linux support is basically nonexistent for ATi GPUs. nVidia's Linux support isn't exactly superb either, but it's not terrible. I run Ubuntu and Gentoo 64-bit installs on a workstation using nVidia cards. My primary desktop (this one) runs only Win7, as I attempted an Ubuntu install with this 5870... it didn't end well.


RE: If only...
By adl on 7/30/2010 1:52:03 PM , Rating: 2
hmm ...

the CUDA support is a bonus i wouldn't mind playing around a bit with ... and the prices for nvidia cards are always lower than their amd equivalents here in india anyways.

and as for the cpu bottleneck ... the (very few) games i'm interested in aren't heavily threaded anyway.

thanks for the help LordSojar ... i'll definitely consider your advice while buying.













Copyright 2014 DailyTech LLC.