

120 comment(s) - last by Darkskypoet on Jul 20 at 5:27 PM

NVIDIA cuts price of GTX 280 and GTX 260 in response to pressure from ATI

ATI has been unusually competitive on both price and performance against NVIDIA's GeForce GTX 280 and GeForce GTX 260 with its recently introduced Radeon HD 4870. Under that pressure, NVIDIA has slashed prices on its high-end GeForce GTX 260 and GeForce GTX 280 video cards.

When the NVIDIA GeForce GTX 280 launched, it came to market at $649, and its little brother, the GTX 260, retailed for $399. NVIDIA has cut the price of the top-of-the-line GeForce GTX 280 to a more affordable $499, and the GeForce GTX 260 has dropped to $299, bringing it closely in line with the ATI Radeon HD 4870.

This brings the ATI and NVIDIA offerings closer together in price and could make it harder for gamers to decide which product to buy. ATI also has its high-end HD 4870 X2 dual-GPU card with GDDR5 coming to battle NVIDIA's GeForce GTX 280.

The new $499 price point for the GeForce GTX 280 should be very close to the expected MSRP for the HD 4870 X2. Early numbers are in for the Radeon HD 4870 X2, and it compares quite favorably to the GeForce GTX 280.

NVIDIA's GeForce GTX 260 drew considerable ire from reviewers and enthusiasts when the Radeon HD 4870 was found to perform very similarly for significantly less money.






Well....
By Dianoda on 7/14/2008 11:54:30 AM , Rating: 4
I feel bad for all those people who paid the original asking price for either one of these cards.




RE: Well....
By FITCamaro on 7/14/2008 12:01:48 PM , Rating: 5
Agreed. But even with these prices, the 4870 is still better than the 260 and the 4870X2 is better than the 280. So ATI is still winning performance per dollar.

Honestly, I wish I'd waited a few weeks for the 4850 to come out instead of getting two 8800GTS 512s. I would have had better AoC performance and wouldn't have had to buy a new motherboard.


RE: Well....
By michal1980 on 7/14/2008 2:12:23 PM , Rating: 4
The only thing I'd disagree with you on is that a single-GPU system will run nearly ALL games at its max performance, whereas the current CrossFire setup is hit or miss.


RE: Well....
By dgingeri on 7/14/2008 3:01:55 PM , Rating: 1
Yeah, the 4870 does beat out the GTX260. That's why I bought one to replace my old 8800GTX. ATI/AMD is certainly winning the performance/dollar crown.

Now, if they'd just get their AA fixed, I'd be happy. Right now, I see no difference between 4X and their max quality 8X other than frame rate. That is one area nvidia wins at. My old 8800GTX did better at 2X than ATI does at 8X, with similar performance.


RE: Well....
By ChronoReverse on 7/14/2008 3:39:41 PM , Rating: 2
Huh?

ATI's current AA scheme is miraculous in performance and it does work. At least on my 4850 it does. I've made some screenshots of this showing the differences between the modes:

http://cid-1b1908eb94064033.skydrive.live.com/brow...


RE: Well....
By Hieyeck on 7/14/08, Rating: -1
RE: Well....
By GlassHouse69 on 7/15/08, Rating: -1
RE: Well....
By SavagePotato on 7/15/2008 10:33:42 AM , Rating: 2
If you look at the recent HardOCP reviews you will see the 4870 outperforms the GTX 280 in Age of Conan.

You are talking about the odd person having problems running 32-bit XP with ATI hardware on Age of Conan; many, many people, myself included, can run it just fine on ATI hardware. As well, every review I have seen using Age of Conan on ATI hardware shows great performance. Time to get off of that dinosaur OS and get Vista 64, where no one has problems running Age of Conan on ATI hardware.

Now let me also get this straight: you won't go for a 4870 because it's too power hungry? Have you seen the wattage the GTX 280 draws? The 4870 is an eco-friendly flower in comparison.


RE: Well....
By WTFiSJuiCE on 7/16/2008 3:24:02 PM , Rating: 2
Yeah, saying the 4870 is too power hungry compared to the GT200 series is laughable when you look at the giant behemoths those cards are atm.

And even if XP is a dinosaur OS, it still might outlast Vista, since 7 is slated for an '09 release ;).

Picture this: Jurassic Park 4, starring...Steve Ballmer holding a copy of vista, walking thru the raptor pen. Oh i would pay money to see that in IMAX. ;)


RE: Well....
By retrospooty on 7/14/2008 12:02:45 PM , Rating: 5
I dont - LOL! =)


RE: Well....
By Polynikes on 7/14/2008 12:08:40 PM , Rating: 2
I don't. They made a stupid decision. ATI's 4K series was clearly the better deal. Still is, if you ask me.


RE: Well....
By MrBlastman on 7/14/08, Rating: -1
RE: Well....
By Parhel on 7/14/2008 2:01:37 PM , Rating: 3
OpenFalcon? I'm not saying I doubt you, but that just illustrates how deep you need to dig to show nVidia coming out on top this time around.


RE: Well....
By MrBlastman on 7/14/08, Rating: 0
RE: Well....
By ChronoReverse on 7/14/2008 3:44:59 PM , Rating: 2
Isn't Falcon like DX6 or something? If it's only getting 70ish FPS even on an 8800GT, I'd say the problem isn't at the video card vendor's feet...


RE: Well....
By MrBlastman on 7/14/08, Rating: 0
RE: Well....
By ChronoReverse on 7/14/2008 5:03:10 PM , Rating: 2
Well, if it's DX7 then it's a "not necessarily". The way 3D was done back then is significantly different from how it's done today to the point that something as "simple" as Falcon in terms of 3D could be utterly bottlenecked by something fairly trivial.

This would also explain why it gets a dismal 70FPS on an 8800GT. And seriously, no matter how accurate the simulation is, that doesn't have anything to do with the GPU but rather the CPU.


RE: Well....
By MrBlastman on 7/14/08, Rating: 0
RE: Well....
By SlyNine on 7/14/2008 6:13:38 PM , Rating: 2
So, in the words of old Bill G., what are you trying to say?


RE: Well....
By ChronoReverse on 7/14/2008 6:26:55 PM , Rating: 2
It doesn't matter how "forward looking" the GPU utilization was back then. If it was using Direct3D, then that's the limit of what it can use.

I've just looked at the release date for Falcon 4, and it was 1998. That means DX7 hadn't even been released yet.

Your claims about the changes from DX7 to DX8 were already quite suspect. A change from DX6 all the way to DX10 involves no fewer than three major architectural differences (T&L, shaders, and DX10's batching and unified shaders).

To wit, there are in fact huge architectural differences, such that bottlenecks that didn't use to exist now do. If the output of the game is correct, then it's not so much a driver bug but rather a game coded "improperly" (for a DX10 card).

And to address your last comment: you're missing my point and arguing something redundant just to push your own agenda.


RE: Well....
By MrBlastman on 7/14/08, Rating: -1
RE: Well....
By ChronoReverse on 7/14/2008 8:48:59 PM , Rating: 2
Well, I come from the FS_Open arena so it's not like I haven't seen code from the DX6 era ported to DX8 and even OpenGL.

Graphics-wise, there's no reason anything with such primitive graphics should run so slow, regardless of whether it's an 8800GT at 70FPS or a 4870 at 20FPS.

If you'd bothered to read my posts instead of assuming I have an ATI bias, you'd realize that's what I'm really trying to say.

Sure, it's possible for drivers to dynamically recompile so much that it'll run fairly decently. But that's indicative of a problem in the code rather than the drivers.


RE: Well....
By ChronoReverse on 7/14/2008 8:50:24 PM , Rating: 2
I just read up on Falcon 4 and it seems there's a Falcon 4: Allied Force that's DX9 compliant. So there's no excuse at all.


RE: Well....
By MrBlastman on 7/15/2008 12:35:12 AM , Rating: 1
Falcon 4: Allied Force is merely a retail boxed (and semi-pilfered) version of Falcon 4.0 with Superpak 3 and pseudo-BMS 1.02 compiled together.

Even though it is a retail product released within the last three years, OpenFalcon and RedViper completely surpass it in every area except perhaps online stability (and OF is beginning to surpass it there too, if it hasn't already).

Otherwise, Allied Force is a lite version of Falcon, and OF (realism-wise) and RV (graphics-wise) add quite a few features over it. It is behind the times. It is a good flavor if you don't want to mess with modding the sim, since you install, patch and play - but you are missing out on quite a bit.

As far as DirectX support... you can have DX9 support in the build but not necessarily use its features. EECH (another modded helo sim) is built using the DX9 library but does not support any of the features at all, so it won't work even if you try to wrap something on top of it, such as this neat mod:

http://boris-vorontsov.narod.ru/index_en.html

which, in any DX8 or 9+ enabled game that uses the feature set, will add HDR/bloom support and more just by dropping the .dll into the game directory. Falcon in all its flavors, plus most other sims, fails that test, and the mod will not work with them.

So you see, we're stuck with what we have until someone implements the feature set. :(

I assume by FS_Open you mean the FreeSpace SCP project that originated on Hard Light. I've used that mod for a few years now, and to even use it right you have to run it in OpenGL mode. ;) I love the mod; they've done a heck of a lot of fine work.


RE: Well....
By SavagePotato on 7/15/2008 10:39:53 AM , Rating: 2
That's all great but here is my point of view.

Who gives a rat's ass? The 12 people playing an ancient game who are worried about their performance in it?

You will have to excuse me for saying "who cares" when it comes to driver optimization for a god-knows-how-many-years-old open-sourced game.


RE: Well....
By MrBlastman on 7/15/2008 11:03:25 AM , Rating: 2
The community is huge - enormous, in fact. The only reason people still play it is that nothing on the same scale has been developed commercially in 10 years.

My point, as it was, is that the only reason I use an Nvidia card is that the drivers are more "friendly" to older apps.

It isn't open sourced either... The code was "leaked" and, much to the chagrin of LP (the IP holder), is still being developed. A soap opera I'm sure you don't care to know any more about.

Go use your card; I'll use mine. For some of us who depend on backwards compatibility, we'll keep using what best suits our needs. I'm not complaining about performance, just pointing out to the ATI zombies that all is not perfect in Camelot. The same holds true for Nvidia.

Obviously you cared enough to respond to my post. ;)


RE: Well....
By SavagePotato on 7/15/2008 12:03:33 PM , Rating: 1
When I got my 8800gts 512 it had driver issues with certain apps I ran.

The bleeding edge hurts, something everyone has to get used to with hardware.


RE: Well....
By FingerMeElmo87 on 7/14/2008 12:12:44 PM , Rating: 2
"nVidia was a bad choice." - Ron Burgundy 07/14/08


RE: Well....
By DASQ on 7/14/2008 12:15:07 PM , Rating: 4
It's hard to picture Ron Burgundy drinking a graphics card.


RE: Well....
By FingerMeElmo87 on 7/14/2008 4:50:50 PM , Rating: 2
you d*ck. lmao


RE: Well....
By del on 7/17/2008 9:15:14 PM , Rating: 2
I feel worse for the people who already bought a different card instead of buying the GeForce GTX 260 for $300 (namely me!) >_<


don't you mean "take them to the mat"?
By RamarC on 7/14/2008 12:42:44 PM , Rating: 5
"Time to go to the mattreses/Take it to the mattresses" means you're hiding out because you're at war and don't want to get snuffed while you sleep at home or you're trying to snuff a guy while he sleeps.

"Take them to the mat" means you're ready to ground and pound. No jabbing, no fancy footwork... you're ready for a brutal fight.




RE: don't you mean "take them to the mat"?
By JonnyDough on 7/14/2008 1:42:00 PM , Rating: 2
I thought "take it to the mattresses" might mean the same thing as "get a room." But then, it could also just mean you need to go wrestle on a bed so you don't hurt yourself. Or, it could stand for "go hide your goods inside your mattresses, so the Nazis don't get them."

Regardless, at this point Nvidia is likely wishing they could AMD's 4870 under a mattress to keep educated consumers from buying it.


By JonnyDough on 7/14/2008 5:46:57 PM , Rating: 2
*hide


RE: don't you mean "take them to the mat"?
By Digimonkey on 7/14/2008 1:50:22 PM , Rating: 3
As quoted from The Godfather:

quote:
No, no, no! No more! Not this time, consigliere. No more meetin's, no more discussions, no more Sollozzo tricks. You give 'em one message: I want Sollozzo. If not, it's all-out war; we go to the mattresses.


RE: don't you mean "take them to the mat"?
By stryfe on 7/14/2008 5:03:09 PM , Rating: 2
What Sonny means when he says "we go to the mattresses" in the above quote is that they'll put their enemies to "sleep", and certainly not by singing them a lullaby.


By Don Corleone on 7/14/2008 8:25:13 PM , Rating: 2
When the mob talks about "going to the mattresses", it means the soldiers holing up in apartments, maybe a dozen men to one apartment or house. They needed more beds, so they brought in mattresses.


RE: don't you mean "take them to the mat"?
By MrBungle123 on 7/14/2008 1:51:40 PM , Rating: 5
The title should read "ATI HD 4000 series bends NVIDIA over the mattress".


By Clauzii on 7/14/2008 10:34:18 PM , Rating: 2
G.A.D.-ATI...


problems with the price
By tastyratz on 7/14/2008 1:26:17 PM , Rating: 2
Don't forget, video cards today are not the video cards of yesteryear. They used to be only "3D accelerators" instead of "GPUs".

The video card today takes on far more responsibility in the computer's operation than it used to, and it was about that time that we started seeing the price increases.

People had no problem blowing 500+ on a quad core, yet they complain about video cards that have a far higher transistor count, memory, and complexity.

2000-2001: the 3dfx Voodoo4 had a 14 million transistor count, and the AMD Duron 600MHz CPU had 25 million. The Duron was about 50 bucks, and the Voodoo4 4500 was $179.

2007: the QX6800 had a transistor count of 582 million, and its MSRP was $500+ (didn't find an exact figure quickly); the AMD HD 2900 XT had a 700 million transistor count, MSRP $449.

A much higher transistor count for something that also includes cutting-edge next-gen memory and a lot more circuitry on board? Thinking about it that way, the price sounds like a steal. High-end computing is getting more expensive, but it is also getting a lot more expensive to produce, and you are getting a lot more for your money.
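Putting rough numbers on that argument (a sketch using only the prices and transistor counts the comment above quotes; the part names and figures are the commenter's, not official specs):

```python
# A rough price-per-transistor comparison using the figures quoted above.
parts = {
    "Voodoo4 4500 (2000)": (14, 179.0),   # (millions of transistors, price in USD)
    "Duron 600 (2000)": (25, 50.0),
    "QX6800 (2007)": (582, 500.0),
    "HD 2900 XT (2007)": (700, 449.0),
}

def dollars_per_million_transistors(transistors_millions, price_usd):
    """Price paid per million transistors."""
    return price_usd / transistors_millions

for name, (transistors, price) in parts.items():
    ratio = dollars_per_million_transistors(transistors, price)
    print(f"{name}: ${ratio:.2f} per million transistors")
```

By this crude measure, the HD 2900 XT (about $0.64 per million transistors) is roughly twenty times cheaper than the Voodoo4 4500 (about $12.79), which is the point being made.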

Would you be upset if one of them released something now that's twice as fast for twice the price because you can't justify spending that? What if they didn't offer it, but video games had half the eye candy?

A mid-level, reasonably priced graphics card can get you playing every game out there now, just not at obnoxious resolutions with all the eye candy cranked up. If you want all the extras game developers put in, be prepared to pay extra for the hardware.




RE: problems with the price
By peldor on 7/14/2008 2:28:54 PM , Rating: 2
quote:
The video card today takes on far more responsibility in the computers role than it used to


Not really. The $50 version of either company's offerings is largely indistinguishable from the $500 version outside of games and 3D graphics programs.

Now, I suppose it could take on far more "responsibility" if everything were generally compiled to run on a programmable GPU when available. That day is not today, however.


RE: problems with the price
By winterspan on 7/14/2008 5:50:42 PM , Rating: 2
"taking on more responsibility" has nothing to do with general computing.

He's referring to the 3D graphics pipeline, AKA starting with hardware transform & lighting on the GeForce -- which is when graphics processors started becoming known as "GPUs".


RE: problems with the price
By Yames on 7/14/2008 3:07:46 PM , Rating: 3
I think the point is that you pay the same or less for a top-of-the-line processor as you did 10, even 20 years ago. That trend did not hold in the video card industry.


NAMES NAMES NAMES!!!
By JonnyDough on 7/14/2008 5:42:32 PM , Rating: 1
quote:
NVIDIA Takes ATI to the Mattresses with Lower Pricing


When are we going to start calling them AMD? ATD? AMi? AAMTDI? Sheesh! What the hell is the name of this company anyway?

ATI GOT BOUGHT.

RIGHT? I'm just so confused. Is anyone else? Shouldn't a company go by one name?




RE: NAMES NAMES NAMES!!!
By Clauzii on 7/14/2008 10:37:41 PM , Rating: 2
They obviously decided to keep the already good brand names. AMD=CPU, ATI=GPU, AMD/ATI=Chipsets. Even AMTI doesn't sound good..


RE: NAMES NAMES NAMES!!!
By jevans64 on 7/16/2008 10:10:01 AM , Rating: 2
DAAMIT has a nice ring to it. LOL

Good for them in offering a good performance GPU for a very attractive price. I currently have an HD 3850, HD 4870, 8800 GTX, 8800 Ultra, and GTX 280. LOL


RE: NAMES NAMES NAMES!!!
By Kim Leo on 7/17/2008 1:08:51 PM , Rating: 2
They should make a sticker with the AMD and ATI logo merged with the text DAAMIT under it :D

ATI sure pulled a nice one :) I got the HD4850 and it's just awesome in performance!

So it's a graphics card collection you are looking for?

HD4850, 7600GT, many FX5200s, 4 Voodoo5 5500 PCI (never unpacked), a Voodoo5 5500 AGP, 4 Voodoo4 4500s, a Voodoo3 3500, a Voodoo3 3000 (never unpacked ;)), and 2 Voodoo3 2000s (and like a million Voodoo2's and 1's :P)

Well, OK, I admit yours is probably a bit more glamorous :P


RE: NAMES NAMES NAMES!!!
By JonnyDough on 7/18/2008 5:36:54 AM , Rating: 2
Dear insolent down-rater,

I don't know who rated me down but I had a legitimate question. Maybe you don't like my use of the word "Hell" and if that's the case, feel free to visit there frequently. I've seen tech/comp sites use both AMD and ATI and I would appreciate it if AMD/ATI would clarify which is to be used for what.

"AMD's graphic division..."

Is that ATI? Or is it AMD's graphic division? It's all a bit annoying to me. Sorry if you're fine with it, but a man should be able to ask a question without being rated down just because you don't care.


Competition, glorious competition
By Aloonatic on 7/14/2008 11:56:43 AM , Rating: 2
Great to see price drops, and so soon. Unless you bought a card (or two) at launch, but then you've probably got enough money in your bank account to take it on the chin. Still, it must be annoying.

I still haven't gotten around to building my new rig (my 2.5GHz non-HT P4 is struggling away now :D), but I look at motherboards and often wonder whether Nvidia would be better off allowing Intel to make chipsets that are SLI capable.

I know a lot of people don't really like the SLI/CrossFire thing, or don't see it as a viable upgrade path, but does anyone else think Nvidia might sell a few more cards if they let people with Intel-based boards operate in SLI mode?

Or would they lose more on their mobos than they would gain?




RE: Competition, glorious competition
By othercents on 7/14/2008 1:49:12 PM , Rating: 2
My understanding is that SLI is an open standard and anyone can make a motherboard chipset that will run SLI. However, Intel partnered with ATI for Crossfire instead of nVidia because nVidia was closely partnered with AMD at the time. Plus, there really isn't an advantage for Intel in building SLI boards, since SLI is considered a top-end configuration that only 3% or so would ever consider.

SLI might also be harder to implement than Crossfire.

Other


By Darkskypoet on 7/20/2008 5:27:35 PM , Rating: 2
Actually, no. Crossfire is open... so no licensing fees, etc. need be paid to ATI/AMD to implement it. That is not the case for SLI. Nvidia uses SLI to sell chipsets... notice that even with X58, mobo mfgs will have to implement an Nvidia PCIe bridge to enable SLI.

If you follow AMD's tactics with tech like this, as the little guy they usually choose to give away the ability to implement it for nothing in order to boost industry acceptance. You'll see this with Crossfire, HyperTransport, 3DNow!, etc.

As well, I think (but am not sure) that Crossfire really doesn't require much more than properly available and wired PCIe slots with "enough" lane width to carry the data.


So...what card will you buy?
By topcat903 on 7/14/2008 1:02:31 PM , Rating: 2
Personally, I would buy the 4870. I'm glad they didn't come out with those cards and try to price them around what Nvidia was charging. This goes to show that the cards shouldn't have been that expensive to begin with, but Nvidia thought they could milk us for all they could until ATI changed that.

I know competition brings prices down, and since Nvidia had no competition for a while, they could set the price high. But I also believe in reasonable pricing.




RE: So...what card will you buy?
By V3ctorPT on 7/14/2008 1:30:36 PM , Rating: 2
I bought the 4870... I was the first in my country to get my hands on it... and I'd still buy it... DX10.1 (although it may be useless with DX11 around the corner), GDDR5... good power consumption (I had a 2900XT), and I play everything at max... except Crysis, but I don't play it... This time ATi won... but it's nice to see nVidia coming down off their pedestal and lowering prices. I wish they would limit their top graphics cards to $400...


RE: So...what card will you buy?
By DeuceHalo on 7/14/2008 4:26:23 PM , Rating: 2
I don't think it matters - the consumer wins this round :)

Getting ready to build a new rig in August myself - looks like I'll have to finally budget in a monitor that does more than 1280x1024.


!ATI
By Andypro on 7/14/2008 2:03:34 PM , Rating: 1
s/ATI/AMD/g.

Come on, folks. The merger has been final for well over a year now. There is no corporation named ATI anymore. Now it's just a division.




RE: !ATI
By dflynchimp on 7/14/2008 7:40:39 PM , Rating: 2
It's still ATI for me... if only for the nostalgia


RE: !ATI
By elpresidente2075 on 7/14/2008 7:50:01 PM , Rating: 2
They may be the same company, but they can still be referenced as different parts of that same company, just as Chevy and GMC are both parts of GM.

Besides, AMD still calls them ATI:
http://ati.amd.com/products/home-office.html


Don't you mean MATT ? ( not mattresses )
By phxfreddy on 7/14/2008 3:41:26 PM , Rating: 2
It's a wrestling analogy. It's Greco-Roman wrestling, not wrestling in the Greek sense. It does not include a fruity interlude on any mattresses.




By sotti on 7/14/2008 5:13:54 PM , Rating: 2
Um, no. Note the picture of Don Corleone?

It's a quote from The Godfather. If you didn't get the quote, proceed directly to the movie store (Netflix) and get Godfather 1.


Quit whining about high end costs
By tedburton on 7/14/2008 6:53:40 PM , Rating: 2
It's really silly to whine that there's an uber high-end GPU you can't afford. The uber high-end GPU is for someone else who has the money to spend on a crazy 600mm^2 die (or two crazy 260mm^2 dies). His reward for paying for such outrageous technology is getting to look down his nose at you (and me).




By SlyNine on 7/14/2008 10:23:30 PM , Rating: 2
He'd better get two, then, because my 8800GT SLI configuration is still faster in many, many games, and was cheaper six months ago when I got it.


Pricing
By JonnyDough on 7/14/2008 1:34:35 PM , Rating: 1
By pricing the GTX 280 at the expected price of the 4870 X2, NVidia is telling consumers the cards are equal. Had they priced it a bit lower, they would actually sell fewer cards, since both cards at that price are aimed at those seeking performance who are less wary of the price.

If AMD wants to sell more X2's, they will up the price a few bucks and then work their butts off on optimizing the drivers. If you beat the king by a hair, you can have your cake and eat it too.




RE: Pricing
By SlyNine on 7/14/2008 10:26:56 PM , Rating: 2
Pretty sure anyone wise enough to build their own computer and put one of these in it would know which one is right for them.

However, the people you describe I'd picture going to Dell/Alienware and just choosing whatever they recommend.


Pricing...
By cscpianoman on 7/15/2008 12:04:12 AM , Rating: 2
I would dare to venture that AMD has a lot more wiggle room in terms of pricing than nVidia does. AMD is on a much smaller process and can get more chips per wafer, while nVidia is stuck with a behemoth. I bet AMD can lower their prices by a pretty large margin and still be profitable with the 4000 series.




ATi takes Nvidia to the mattress
By rupaniii on 7/17/2008 10:12:35 AM , Rating: 2
Yeah, sorry, but nVidia dropping its prices to comparable levels and still being outperformed by ATi is not an example of nVidia catching ATi. ATi is killing nVidia's profit expectations and marketing strategy. nVidia didn't even have a marketing strategy; they pretty much thought they could go, "NEW NVIDIA CARD, YOU MUST BUY NOW."
Eh, it didn't work that way, guys. Thanks for shopping.




Little late...
By jarman on 7/14/2008 2:17:01 PM , Rating: 1
This was news last week...




How much does it really cost?
By spec74 on 7/14/2008 2:31:23 PM , Rating: 1
With all these crazy prices for the latest GPU cards, how much does it actually cost the companies to make one anyway? It seems like they're marking up like 5000% to make money.

(e.g. it cost Nike about 2-3 bucks to make a pair of Nikes during the '90s - that's what I knew around then)




I am glad
By MrBlastman on 7/14/08, Rating: -1
RE: I am glad
By killerroach on 7/14/2008 12:00:57 PM , Rating: 4
quote:
I long for a day when we can once again purchase top video cards for between 100 - 200 bucks and get top level quality.

Once again? When was that? I remember paying $400 for my GeForce2 Pro when it came out... I can't remember a high- or near-high-end card in that price range since possibly the Voodoo Banshee. If anything, we're seeing a better return for a $200 video card now than in the past...


RE: I am glad
By MrBlastman on 7/14/2008 12:07:46 PM , Rating: 5
My Monster 3D cost me 150 bucks or so at the time. My Number Nine S3 ViRGE (oh wow, talk about power!) cost me 100 bucks.

Now, those are pretty pathetic compared to today's stuff, but back then they were cutting edge. My Paradise VGA card (back in 1988) cost me $80 - it was totally cutting edge for the time.


RE: I am glad
By Jackattak on 7/14/08, Rating: -1
RE: I am glad
By MrBlastman on 7/14/2008 3:20:05 PM , Rating: 3
Technology, unlike most things in life, becomes less expensive as time goes on.

Knowledge and science expand at an exponential rate, not a linear one.

At a minimum, you might inflate these prices by the rate of inflation, but even then technology does not neccessarily follow that same rate. For instance - in 1990 I upgraded to a 386 DX 33 with 4 megs ram, 80 gigs hd, 2400 bps external modem, Sound Blaster Pro and a 1 meg Trident VGA video card (great card for the time). It cost about 2100.00. That was pretty darned good performance for the time for gaming.

Now, you can get a Q9450, 4 gigs of DDR800 RAM, an X-Fi Fatal1ty with front panel, and a 260/3000-series video card for about $1,200 or so. This system has really darned good performance as well.

So no, given the trend of technology, you should really pay less relative to historical figures, not more. Manufacturing techniques improve with each generation, and the knowledge needed to produce the next step compounds upon past discoveries. So even here, at best you might see inflation kick in, but in reality the trend is _still_ counter to that of inflation.
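As a rough sanity check on the inflation point above (a sketch; the ~3% average annual inflation rate is an illustrative assumption of mine, not a figure from the comment):

```python
# Compare the quoted 1990 system price, compounded for inflation, with the 2008 one.
# The 3% average annual rate is an illustrative assumption, not an official figure.

def inflate(price, years, annual_rate=0.03):
    """Compound a price forward by `years` at `annual_rate`."""
    return price * (1 + annual_rate) ** years

price_1990 = 2100.0   # the 386 DX 33 system quoted above
price_2008 = 1200.0   # the Q9450 system quoted above

adjusted = inflate(price_1990, 2008 - 1990)
print(f"$2,100 in 1990 is roughly ${adjusted:,.0f} in 2008 dollars")
print(f"The 2008 system costs about {price_2008 / adjusted:.0%} of the 1990 one in real terms")
```

Even before accounting for the far greater performance, the 2008 build comes out to roughly a third of the 1990 build's inflation-adjusted price under this assumption, which is the commenter's point.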


RE: I am glad
By lotharamious on 7/14/2008 4:24:36 PM , Rating: 2
quote:
For instance - in 1990 I upgraded to a 386 DX 33 with 4 megs ram, 80 gigs hd, 2400 bps external modem, Sound Blaster Pro and a 1 meg Trident VGA video card (great card for the time).

80 gig??? I bought a 4 gig drive in 1996. Then I upgraded to a 14.4k modem. BLAZIN'!!!!


RE: I am glad
By MarkHark on 7/14/2008 4:32:06 PM , Rating: 2
I'm sure he meant 80 megs. I've made the same mistake more than a couple times.


RE: I am glad
By MrBlastman on 7/14/2008 4:40:04 PM , Rating: 2
Sorry, I meant 80 meg. Touche.

I got a 2 gig drive in 95 - and I thought that'd be all I'd ever need... boy was I wrong.


RE: I am glad
By lotharamious on 7/14/2008 4:45:25 PM , Rating: 2
quote:
Sorry, I meant 80 meg.

I know what you meant. I was just giving you a hard time. Heck, my first computer had a 512 meg HD. That was in 95. The term "gig" was still in its infancy. Wow, we've come so far.


RE: I am glad
By PrinceGaz on 7/14/2008 5:12:43 PM , Rating: 2
We certainly have. "Gig" is now in its final year or two for describing the capacity of traditional (not SSD) desktop hard drives; we'll be routinely using "tera(s)" soon. Already the sweet spot for desktop HD capacities, where you get the most capacity per dollar, is around the 500-750GB region, and sometime next year it is sure to be at least 1TB.


RE: I am glad
By rollakid on 7/14/2008 8:26:45 PM , Rating: 2
Talking about old HDDs, I always chuckle whenever I put my 4-gig pendrive next to my first HDD, which is a 1.5-gig (and still working).

Then the idea of high-density microSDs pops into my mind...


RE: I am glad
By Clauzii on 7/14/2008 8:30:21 PM , Rating: 2
My first DH is a 20 MB (Seagate, I think) sitting comfortably in an IBM Personal System/2 Model 30, 8 MHz 8088. Works to this day. Great for those TOTAL old-school games :)


RE: I am glad
By Clauzii on 7/14/2008 8:42:48 PM , Rating: 2
That should be HD, not DH.


RE: I am glad
By Captmorgan09 on 7/14/2008 10:25:20 PM , Rating: 2
Must have been nice to have a hard drive... I had dual 5 1/4" disk drives with CGA graphics in DOS.


RE: I am glad
By Clauzii on 7/14/2008 10:32:04 PM , Rating: 2
It is. The floppy is 3.5", though. I don't have a working 5 1/4" :( (only for my C64 :))


RE: I am glad
By Spyvie on 7/14/2008 12:26:04 PM , Rating: 2
I bought a Rendition Verite card when the price finally dropped to $99 and promptly boosted my frag count in QW with all those fancy OpenGL colors.


RE: I am glad
By kmmatney on 7/14/2008 12:29:03 PM , Rating: 2
My Ti4200 was $120 and could play any game out comfortably at the time.

My original Radeon LE (which could be bios flashed to a real Radeon) was $65 and could play any game out comfortably at the time.

My Voodoo 1000 was $45 and could play any game out comfortably at the time.

I spent $89 on my Banshee - OK that wasn't the greatest card, but I could "almost" play every game I owned at comfortable settings.

So yes, you used to be able to spend well under $200 and get a video card that could play all the latest games. However I do think that the HD4850 and 8800GT are now at nice prices. I'm tired of seeing $300+ video cards.


RE: I am glad
By masher2 (blog) on 7/14/2008 6:35:04 PM , Rating: 1
> "So yes, you used to be able to spend well under $200 and get a video card that could play all the latest games"

You still can. Just turn down the settings a bit...the game is still going to look far, far better than anything you saw on your old Ti4200. There isn't a game made that absolutely requires a top of the line card. Publishers aim for the sweet spot in sales.

> "I'm tired of seeing $300+ video cards"

So don't buy them? I don't know why simply having the option to do so would annoy anyone so much.


RE: I am glad
By ChronoReverse on 7/14/2008 6:42:33 PM , Rating: 2
With the existence of the 4850 (and the 9800GTX if you really want Nvidia), that's not much of an argument.

We have high-performing cards at great prices. Even Crysis is playable on medium-high (or higher if you tolerate lower resolutions). And this is despite Crysis having something seriously wrong with it, considering its performance relative to the graphics you get.


RE: I am glad
By Clauzii on 7/14/2008 8:26:52 PM , Rating: 2
What was the difference between the Voodoo2 and the Voodoo2 1000?


RE: I am glad
By jabber on 7/15/2008 5:41:09 AM , Rating: 2
I think if I remember right..

The Voodoo2 was a 3D add-on card that required a separate 2D graphics card to work with. A Matrox Millennium 2 was the preferred choice, I think.

The Voodoo 2 1000/2000/3000 etc. was a fully integrated 2D/3D offering like the Banshee.

Not sure it wasn't called the Voodoo 3 though. Too lazy to look it up I guess.


RE: I am glad
By Clauzii on 7/15/2008 4:06:03 PM , Rating: 2
Thanks :)

OK, I get it, they probably had some "VooDoo2 - 1000" version. I remember the 2000 and up. My friend has exactly that Matrox and two Voodoo cards. Although they go by different names, they look 100% identical component-wise, and even though the Creative driver doesn't say SLI, Need For Speed runs very nicely in 1024x768 :))


RE: I am glad
By Jedi2155 on 7/14/2008 1:42:25 PM , Rating: 2
My Voodoo 3 3000 AGP was $120 back in November '99 which was only 6 months after release, and quite competitive with the TNT2's of the day for the top of the line market. I do recall the Voodoo 3 3500 with the TV tuners were going for around $190 as well, under $200.


RE: I am glad
By Denigrate on 7/14/2008 4:40:44 PM , Rating: 2
Voodoo 3 3500. Wow, that brings back great memories of UT and Starcraft! What a great card, which I got at a massive discount at a Hastings because of a pricing error. I was sad when 3DFX bit the dust. My next card, an NVIDIA Ti 4200, sure served me well. As I was lazy, I'm suffering along with the 8600GT that came with my first non-home-brew computer, a Dell Inspiron.


RE: I am glad
By Locutus465 on 7/14/2008 1:49:23 PM , Rating: 2
? I remember $199 being the sweet spot for a very well performing 3D accelerator for a long time. All the Matrox cards I owned (early G200 era, when they were still competitive) were around there, as were my S3 ViRGE products, which were the first 3D cards I ever owned or remember seeing on the store shelves at Best Buy.


RE: I am glad
By RamboZZo on 7/14/2008 1:51:20 PM , Rating: 2
I think I paid something like $149 for my first Riva TNT. Outrageous, when the Voodoo 1 I had before it was something like $99. When I bought a Diamond Viper TNT 2, I thought it was downright criminal that they were charging $199 for it! All of those were absolute top-of-the-line cards at the time. That was also about the time I realized every subsequent generation was going to creep up in price to insane levels. It's never going to happen, but boy would it be nice to have the top-of-the-line card for $200 again.


RE: I am glad
By Belard on 7/14/2008 2:59:45 PM , Rating: 2
When the Voodoo1 first came out (4MB card, 800x600 with 16-bit graphics, if I remember right), it was a $200 video card that required a 2D video card to use Windows 9x.

The MSRP on the Voodoo2 when it first came out was $400 with 12MB. It could do 1024x768! And some people bought two of them for the original SLI. When the Voodoo3 came out at $180, that was considered CHEAP.

Back in the days of Windows 3.x in 1994, a cheap junky video card was $100 with 1MB of RAM and a 90-day warranty. Apple charged customers about $500 to have 16 colors on their $5000 Mac II (1987) and $1000+ for 256 glorious colors!

$100 for a 3870, that's dirt cheap ;)


RE: I am glad
By RamboZZo on 7/14/2008 4:36:57 PM , Rating: 2
The Voodoo 2 was about $250 retail when it came out. I remember that very specifically because I bought one the first day they hit store shelves, had my car broken into that same day, and it was stolen from my car along with my radio, so I ended up buying two. Still makes me mad to this day. It could actually only do 800x600; only in SLI could it do 1024x768. It's still interesting that the TNT, which came out afterwards and had performance at least equal to one V2, launched for a lot cheaper and didn't require a 2D card. You could also get the Riva 128 in the low-to-sub-$100 range.

You are right about pre-3d acceleration day video cards. They were a far bigger rip-off than anything else.


RE: I am glad
By Clauzii on 7/14/2008 8:34:11 PM , Rating: 2
Riva TNT 16MB was my first REAL 3D card. It played Need For Speed and Unreal - and I was happy. :)


RE: I am glad
By just4U on 7/15/2008 5:34:41 AM , Rating: 2
I didn't know much about computers back then, but I do recall the Voodoo2 being about $350 on launch here in Canada. Most of the cards I purchased, from the GeForce all the way up to the 7900, hovered somewhere around $500-700, with one exception: on launch the 8500 PRO was $399 and held its own against the GeForce3, which I purchased for $650.


RE: I am glad
By Denithor on 7/16/2008 3:27:39 PM , Rating: 2
quote:
It's never going to happen but boy would it be nice to have the top of the line card for $200 again.


Well, keep in mind that $200 15 years ago is >$300 in today's dollars (assuming a mere 3% inflation per year).

I personally tend to stick to what's available in the $100-150 range, no matter how high the top end goes. If I'm not happy with the performance offered in that range, I simply wait a few months and prices come down as new higher-performance cards are released and previous top models become mainstream.
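The compounding behind that estimate is easy to verify. A minimal sketch, using the comment's own assumptions (a flat 3% annual rate over 15 years, not official CPI data):

```python
def inflation_adjust(amount, rate, years):
    """Compound a price forward by a flat annual inflation rate."""
    return amount * (1 + rate) ** years

# $200 fifteen years ago, compounded at an assumed 3% per year
today = inflation_adjust(200, 0.03, 15)
print(round(today, 2))  # -> 311.59, i.e. a bit over $300
```

So a $200 card from the Ti 4200 era does land north of $300 in 2008 dollars even at a modest 3% rate; actual CPI over that span would push it higher still.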


RE: I am glad
By Spivonious on 7/14/2008 12:02:39 PM , Rating: 2
My 3850 512MB runs everything I throw at it at adequate framerates (it can even do Crysis maxed out except AA at a playable framerate).

With that said, sure I'd love for things to be like the old days where $300 would get you the absolute top-of-the-line video card and played all games at 60+ fps. But for that to happen we'd have to convince all the people buying the $500+ cards to stop. Supply and demand in full effect.


RE: I am glad
By Lakku on 7/14/2008 3:03:30 PM , Rating: 2
Yeah, at 800x600 maybe. I ran Crysis at very high settings at 720p on a $4k computer (an overclocked 8800 GTX was the best at the time) and it got about 21-25 fps on average. That is borderline adequate, so I don't believe you, as the 3850 is quite a bit slower than an 8800 GTX with a quad core at 3GHz. Unless of course you have a super CPU OC'd to 4+ GHz, I suppose.


RE: I am glad
By ChronoReverse on 7/14/2008 3:36:58 PM , Rating: 2
Indeed. My 4850 can only handle mixed High and Medium settings with NoAA at 1680x1050 giving me about 30FPS.

There's no possibility a 3850 can run it "maxed out" at any reasonable resolution.


RE: I am glad
By ajvitaly on 7/14/2008 8:32:03 PM , Rating: 2
You need to get the triplecpack mod for crysis. It'll boost your frame rate by 15-20% and improve the graphics.


RE: I am glad
By Spivonious on 7/15/2008 11:15:12 AM , Rating: 2
I'd gladly sacrifice resolution for high quality settings. Luckily my CRT doesn't make every resolution except the native one look like crap. The colors are a lot better too :)


RE: I am glad
By Radnor on 7/14/2008 12:14:31 PM , Rating: 2
Yup, I am very glad also. Keeping the greed at bay and stirring up the market to deploy a better product. The G80 and G92 were kings for too long. Except for when the 3D market started (3dfx, Matrox, ATI, NVIDIA, S3) and development cycles were a bit long, this last development cycle was an exception.

I hope it continues like this. The last generation took too long, for too much money as well. Good for progress and great for us.

And Crysis is like Chuck Norris: unbeatable and only good for forum memes/jokes.


RE: I am glad
By MonkeyPaw on 7/14/2008 1:07:51 PM , Rating: 2
The only system that can handle Crysis is Chuck Norris.


RE: I am glad
By MrBlastman on 7/14/2008 1:37:48 PM , Rating: 2
Chuck Norris doesn't just handle Crysis, he roundhouse kicks it!


RE: I am glad
By shaw on 7/14/2008 2:38:27 PM , Rating: 3
There is no 'ctrl' button on Chuck Norris's computer. Chuck Norris is always in control.


RE: I am glad
By DeuceHalo on 7/14/2008 4:23:49 PM , Rating: 3
Now I'm having flashbacks to the Barrens chats in WoW. Thanks. ;) *shudders*


RE: I am glad
By Omega215D on 7/14/2008 3:08:35 PM , Rating: 2
It's a good thing that Crytek learned their lesson, and with that, Crysis Warhead will be a better performer on plenty of hardware. In the new issue of PC Gamer, they built a $700 rig that was able to play Warhead on high settings. Now if only they would make some kind of patch to do that with Crysis (I'm able to play it, so I'm not complaining).


RE: I am glad
By Diosjenin on 7/14/2008 12:30:30 PM , Rating: 4
It's not just you - but it's been around for a while. If memory serves me right, the last time ATI was undoubtedly on top, the Radeon 9800 XT cost somewhere in the ~$500 range.

Just for fun, I looked up an inflation calculator (www.westegg.com/inflation). Harking back to 1992 - you know, Wolfenstein 3D and what not - a $300 GPU setup (and I have no recollection of what that would have been) would be $450 today.

Loaves of bread used to cost 5 cents once, too, people.


RE: I am glad
By CyborgTMT on 7/14/2008 1:21:31 PM , Rating: 2
Did a quick check of some cards I purchased in the past few years and what they would cost today:
Voodoo 3 3500 - $316
GeForce 3 - $426
8500 - $477
GeForce4 Ti - $467

I'll wait a couple of years before I look to see what both my 7950 GX2s @ $600 a pop come to. Guys, if your wife ever needs a reason for a divorce, spend $1200 on video cards - problem solved.


RE: I am glad
By StevoLincolnite on 7/14/2008 1:35:54 PM , Rating: 2
I dunno, the X850XT PE, although it didn't support SM3, was a seriously great card compared to the GeForce 6 series; heck, it even out-performed the 7900GS when overclocked. Not bad in my opinion.

The X1950 Pro was also a solid choice in my opinion, as are their PCI cards (not PCI-E) for older machines. A friend grabbed a PCI X1550 and it handles Oblivion fine on medium quality; he could have gotten the 2400 Pro PCI card, but at the time it was not worth it for a little performance boost.

Now I wonder if there will be AGP versions? I know people using Socket 939 boards with an Athlon 64 X2 3600+ who would jump all over this, just so they could fold with it in the future.


RE: I am glad
By Omega215D on 7/14/2008 2:57:20 PM , Rating: 2
I think it was either Sapphire or PowerColor who came out with an AGP version of the 3850. Chances are they might do it again, as it seems the AGP bus isn't really a bottleneck. According to some sites, getting these cards is worth it if you have a dual-core proc on an AGP platform. I would hazard a guess that the 4850 will be available for AGP.


RE: I am glad
By SlyNine on 7/14/2008 6:08:16 PM , Rating: 2
I had a Neo2 Platinum and the A8N32X (I believe that's what it was called). Both were AGP and both could handle top-of-the-line 939 chips. I even had one 4200 OC'd to 3GHz, plenty for pretty much any video card out there.

But I wonder how many people out there are in the same boat; it's got to be worth it for the card makers to bother. I often find myself in a situation where I'm the only one trying whatever configuration.


RE: I am glad
By Spivonious on 7/15/2008 11:18:09 AM , Rating: 2
Heh, in 1992 GPUs weren't even in the imagination of computer users. My 386 16MHz with 2MB RAM ran Wolf3D just fine :)


RE: I am glad
By cubby1223 on 7/14/2008 1:07:30 PM , Rating: 2
And I also remember the 486DX2-66 CPU retailing for $1,000. Or go back a little further, and memory was $40 per MEGAbyte. Or a little further back, and a floppy drive was a couple hundred dollars.

Look, seriously, the bottom line is that there are many people who will buy the $100 cards, and there are enough people who will go for the $500 cards. Video card manufacturers are in the business to make money. If they put their top-of-the-line video cards at $100, I'd have to think it wouldn't increase sales enough, and actual revenues would be down.

Not to mention none of us actually knows the true financial numbers on these cards.


RE: I am glad
By SiN on 7/14/2008 2:07:13 PM , Rating: 5
They aren't lowering prices voluntarily; AMD/ATI beat them with a better chip for less money than the competition.

I'm glad to see lower prices too, but, and this is the important bit, NVIDIA didn't lower the price. AMD/ATI did! Brilliant.


RE: I am glad
By bongsi21 on 7/14/2008 5:25:56 PM , Rating: 2
We need more competition. Support AMD video cards! FOR MORE PRICE CUTS!


RE: I am glad
By Oscarine on 7/17/2008 11:14:04 AM , Rating: 1
The 200SB was about $650, and the X24 could be had for about $450; there were still expensive cards before that. The Matrox Millennium and Millennium 2s were pretty expensive, especially the 2 in the 16MB variant. Which, I might add, I played Quake 2 on at 1600x1200 in SOFTWARE mode. No textures however, heh.

I actually didn't like early 3D hardware's limited resolutions: 640x480 for the Voodoo 1 unless you had the 100SB, 800x600 for non-SLI'd Voodoo2s, etc. The graphics were very, very soft.



Related Articles
NVIDIA Launches GTX 200 GPU Family
June 16, 2008, 3:07 PM













Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki