AMD's newest is an alternative to NVIDIA's last-generation high end

Hot on the heels of NVIDIA's GTX 200 family launch, AMD will introduce its 55nm RV770-based Radeon 4850 next week. 

The Radeon 4850 features a 625 MHz core clock and a GDDR3 memory clock in excess of 2000 MHz. Corporate documentation explains that the 480 stream processors on the RV770 processor offer considerable enhancements over the 320 stream processors found in the RV670 core, though AMD memos reveal little about how this is accomplished.

The RV770 includes all the bells and whistles of the RV670 launched in November 2007: Shader Model 4.0, OpenGL 2.0, and DirectX 10.1.  The only major addition appears to be "Game Physics processing" -- indicating a potential platform for AMD's recent partnership with Havok.

The new Radeon lacks GDDR5 memory, promised by an AMD announcement just weeks ago. Although the RV770 does support GDDR5 memory, this initial launch consists exclusively of GDDR3 components.  AMD documentation hints at the launch of a Radeon 4870 later this summer, but it offered no comment on when it will eventually ship a GDDR5 product.

If Radeon 4850 sounds familiar, that's because it is. The RV770-based FireStream 9250, announced just a few days ago, broke the 1 teraflop barrier using the same graphics core.  However, that paper-launched workstation card will retail for more than $900 when it finally hits store shelves.  The mainstream Radeon 4850 offerings will launch and ship on the same day next week.

AMD partners claim the new card will not compete against the $600 GTX 200 just announced yesterday. Instead, AMD pits the Radeon 4850 against the recently re-priced NVIDIA GeForce 9800 GTX.  Distributors claim the 4850 will see prices as low as $199 at launch -- well under the $299 MSRP for GeForce 9800 GTX.  More expensive versions of RV770 will feature HDMI, audio pass-through and possibly the fabled Qimonda GDDR5 memory.

Specifications from Diamond Multimedia marketing material claim the new Radeon will require a 450-watt power supply for a single card, or a 550-watt unit for CrossFire mode.

Update 06/09/2008: As of this morning, AMD has lifted the embargo on its 4850 graphics cards. AMD's newest documentation claims the RV770 processor contains 800 shaders, but the card is not expected to show up on store shelves before the planned June 25 launch date.
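
As a quick sanity check on the teraflop claim, the arithmetic works out if each of the 800 stream processors retires one multiply-add (two floating-point operations) per clock at the 625 MHz core clock; that per-SP assumption is an illustration, not something spelled out in the AMD documentation quoted above. A minimal Python sketch:

    # Rough sketch: theoretical single-precision throughput of RV770.
    # Assumption (not from AMD's documentation): each stream processor
    # retires one multiply-add, i.e. 2 FLOPs, per clock.
    STREAM_PROCESSORS = 800       # per AMD's updated documentation
    CORE_CLOCK_HZ = 625e6         # Radeon 4850 core clock
    FLOPS_PER_SP_PER_CLOCK = 2    # assumed: one multiply-add per clock

    peak_gflops = STREAM_PROCESSORS * CORE_CLOCK_HZ * FLOPS_PER_SP_PER_CLOCK / 1e9
    print(f"Theoretical peak: {peak_gflops:.0f} GFLOPS")   # -> 1000 GFLOPS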



Comments

Over $300 makes a video card irrelevant
By gochichi on 6/17/2008 4:27:11 PM , Rating: 5
Nvidia is going to have to refresh their 8800GTS-level cards yet again so they can turn a better profit from them.

Remember that ATI has a smaller process and is making a better profit for every HD3870 sold (as opposed to per 8800GT sold).

ATI has decided to keep it green, clean, and priced right and it's absolutely the right decision in my opinion. Computer gaming can't be based on $600 video cards... people don't want that, and they don't even need that.

People are thinking: get a PS3, an Xbox 360, or a video card? They already have their new Dell, HP, or eMachines box with a dual-core processor... they will choose the video card if it's about $200.00 and doesn't need a $200 power supply to work.

ATI, specifically the 4850, will be the card of choice for 12 months... I can almost guarantee that.

Anybody play Call of Duty 4? People are still using a Radeon X1950 and single-core processors... or a 7900GT or whatever. Why get a $600 video card when you can get a complete, really excellent gaming machine for the same price? I play COD4 with max settings, no AA, at 1920 x 1200 on an HD3870... I'm not about to spend a ton of money to turn 4xAA on... no way.

It's about fun and games, not about benchmarks, and I think ATI understands that. They did well to let NVIDIA release their overpriced junk (in a slowing economy, no less); ATI is going to sell as many new Radeons as they can deliver.

The more time passes, the more it seems that NVIDIA just stumbled upon the right formula with the 8800GT. Now ATI is running with that formula, keeping the price and adding features, and it will absolutely sell these units.




RE: Over $300 makes a video card irrelevant
By BruceLeet on 6/18/2008 3:25:50 AM , Rating: 2
4xAA isn't really needed at 1920x1200 anyway; COD4 has small maps. On the pricing I agree with you. I'm 100% confident my next upgrade or build will be AMD offerings, since price:performance favors AMD; then there are people who only pay attention to logos.

AMD should do Mac-style commercials about Nvidia. Well, not TV commercials, just little episodes you can watch on their AMD GAME! site, just to see the kind of things that would be said. Improv, anyone?


RE: Over $300 makes a video card irrelevant
By freeagle on 6/18/2008 9:00:03 AM , Rating: 5
quote:
4xAA isn't really needed on 1920x1200 anyway, COD4 has small maps


Could you please explain the relationship between map size, video resolution and antialiasing?


By Goty on 6/18/2008 1:40:47 PM , Rating: 2
I don't know, but the aliasing is pretty bad at 1920x1200 on CoD4 for me. 2xAA is plenty, though, in that case.


RE: Over $300 makes a video card irrelevant
By BruceLeet on 6/18/2008 2:12:21 PM , Rating: 3
I play two games; one of them has view distances up to 1000 in-game meters. I need AA in that game, because it's hard to see snipernoobs at a distance if they are near a wall of trees, in that particular game that is. In COD4, though, I can shoot across most maps without having a problem seeing at 1920x1200.

Can you see where I'm coming from yet?


RE: Over $300 makes a video card irrelevant
By ChronoReverse on 6/19/2008 1:45:33 PM , Rating: 2
AA doesn't just allow you to see things, it removes "jaggies" which are VERY visible even at 1920x1200 on most monitors. The jaggies won't go away unless you have a very good dot pitch and even then 2xAA would make a big difference.


By Martimus on 6/19/2008 4:21:53 PM , Rating: 2
In fact, AA would probably make characters harder to see from a distance, since their silhouette would be smoother than without it. (One of the easiest ways to pick out characters against the background is to look for the pixelated lines they create against the backdrop.)


By BruceLeet on 6/19/2008 4:57:51 PM , Rating: 2
Well, try to think about it: you're looking for an object in the trees at a distance in a game. That's a lot of lines (the trees) and that's a lot of jaggies; you simply can't make anything out, and that's where I need AA. But in COD4, no... it's a fast-paced game with a lot of movement. You need max FPS in that game, so I don't use AA.


RE: Over $300 makes a video card irrelevant
By jarman on 6/18/08, Rating: -1
RE: Over $300 makes a video card irrelevant
By Goty on 6/19/2008 1:36:28 AM , Rating: 5
Who's drinking the kool-aid now? AMD is launching a new card because *gasp* it's in their product cycle! A new architecture released a year after the previous one? It's unheard of!

As for your claims that ATI can only compete in the mid-range, what did you call the HD3870X2? What do you call the HD4850, a midrange card that performs on par with the 9800GTX using a die smaller than an American dime? And what are you going to call the HD4870X2 that, as I've said before, should by all indicators outperform the GTX 280 by a healthy margin?


RE: Over $300 makes a video card irrelevant
By Alpha4 on 6/19/08, Rating: 0
RE: Over $300 makes a video card irrelevant
By NullSubroutine on 6/20/2008 6:35:34 AM , Rating: 3
I don't know if you are smoking something or just plain spreading FUD, but the 4870 X2 (aka R700) will be released 8 weeks after the 4870, which is June 25th. So that places it in August.

They are waiting to get the drivers right before the launch, but I have already seen benchmarks run on current drivers, and it's beating the GTX 280.


By Alpha4 on 6/20/2008 11:25:11 AM , Rating: 2
Just curious, where did you read about the 4870x2 release date? Or even the 4870 at that? The article says "AMD documentation hints at the launch of a Radeon 4870 later this summer".

I'll try to provide links later where I read about the 4870x2 release date. I'm @ work at the moment unfortunately.


RE: Over $300 makes a video card irrelevant
By jarman on 6/19/2008 6:14:35 PM , Rating: 2
Really? Tell me how that "product cycle" has influenced AMD's earnings over the last two quarters as compared to nVidia's...


By Goty on 6/20/2008 1:39:22 AM , Rating: 2
Gee, that's a little hard to say, isn't it, since the product didn't launch in the last two quarters?


RE: Over $300 makes a video card irrelevant
By Lightnix on 6/19/2008 2:40:38 PM , Rating: 3
The 4850 costs well under half the price of the GTX 280 ($200 vs. $649), and two of them in CrossFire seem to beat the GTX 280 by a reasonable margin in a fair number of scenarios. The 4870 X2 will be just this, but clocked higher on the core and much higher on the memory (GDDR5). AMD is planning on competing in the high end, just with two small cores rather than one huge one.


RE: Over $300 makes a video card irrelevant
By just4U on 6/19/2008 9:26:28 PM , Rating: 2
Yeah, I don't really factor CrossFire in myself. I just look at the card, see that it's on par with and beating the 9800GTX with a smaller footprint at the $200 price point, and think... Hmmmmmmm :)

I was very pleased with ATI and Nvidia when they had their respective launches this past Nov/Dec. Those mid-range cards kicked things into high gear, and you can tell that both companies made money off of it. I'm glad that AMD has decided to continue the trend, because that's where the profits are!


By Schrag4 on 6/20/2008 12:16:23 PM , Rating: 2
Agree about not factoring in CrossFire. If you're the type that will have dual cards in your system, then price really doesn't matter, right? Am I way off here? Can't you get more performance from a single higher-end card vs. buying two cheap cards? Which means the only people with multiple cards have two very high-end cards (and obviously don't care how much money they spend).

As a real-world example, I bought my NVidia 7800 GT almost 3 years ago for right around 300 bucks. I *could* put a second card in my machine, but I would see much better performance if I just scrapped the 7800 and got a card that's 2 generations newer. And no, I'm not going to spend between 400 and 1300 bucks for a dual-card solution. If I had that kind of money I wouldn't still be using a 7800GT today, would I...


Alarming?
By Creig on 6/17/2008 1:27:14 PM , Rating: 5
quote:
The most alarming thing about the new Radeon is the lack of GDDR5 memory, promised by an AMD announcement just weeks ago. Although the RV770 does support GDDR5 memory, this initial launch consists exclusively of GDDR3 components. AMD documentation hints at the launch of a Radeon 4870 later this summer, but it offered no comment on when it will eventually ship a GDDR5 product.


Why is this alarming? The RV770 GPU supports GDDR3/4/5, and it's up to the OEMs to decide which memory (and how much) to place on their cards. The 4850 isn't positioned as a top-end card, so the manufacturers decided (rightly) to put GDDR3 on them. Why would they want to pair top-shelf memory with a mid-range card? I would imagine you'll find GDDR5 on the upcoming 4870 and 4870 X2.




RE: Alarming?
By KristopherKubicki (blog) on 6/17/2008 1:30:06 PM , Rating: 1
Because the Qimonda / AMD announcement seemed to imply we'd see GDDR5 sooner rather than later. It's going to be later.


RE: Alarming?
By ChronoReverse on 6/17/2008 1:34:07 PM , Rating: 5
It was well-known that the 4850 was a GDDR3 part while the 4870 would have GDDR5.

It could be said that it's alarming that the 4870 is late but the same can't be said for the 4850 which is what this article is about according to the title.


RE: Alarming?
By FITCamaro on 6/17/2008 2:09:22 PM , Rating: 3
Yeah I was about to say. It was my understanding that the 4850 was to be primarily GDDR3 and the 4870 would be the GDDR5 part.

If this thing starts as low as $199, that's a damn good deal.


RE: Alarming?
By defter on 6/17/2008 2:21:22 PM , Rating: 2
quote:
It could be said that it's alarming that the 4870 is late but the same can't be said for the 4850 which is what this article is about according to the title.


You should read the content. I don't understand why the author was modded down. He said quite clearly that the lack of a GDDR5 part (read: 4870) is alarming, because it was supposed to launch next week:

quote:
The most alarming thing about the new Radeon is the lack of GDDR5 memory, promised by an AMD announcement just weeks ago. Although the RV770 does support GDDR5 memory, this initial launch consists exclusively of GDDR3 components. AMD documentation hints at the launch of a Radeon 4870 later this summer, but it offered no comment on when it will eventually ship a GDDR5 product.


RE: Alarming?
By Goty on 6/18/2008 1:39:17 PM , Rating: 3
The 4870 is due to launch a little later, I think. The 4850 was the only one of the two set to launch next week.

I could be wrong, though.


480 or 800?
By homerdog on 6/17/2008 2:41:25 PM , Rating: 2
Or is it 480 with some voodoo that makes them perform like 800? Somehow they're getting a teraflop out of this thing.




RE: 480 or 800?
By ChronoReverse on 6/17/2008 3:11:15 PM , Rating: 2
Should be 800. All the early results point to that (plus it's hard to get the supposed 1TFLOP, as you say, without 800).


RE: 480 or 800?
By homerdog on 6/17/2008 3:14:11 PM , Rating: 2
Okay, after a quick Google session it looks like 800 stream processors after all. There certainly is a lot of disinformation out there regarding RV770...


RE: 480 or 800?
By Parhel on 6/17/2008 4:35:17 PM , Rating: 2
RE: 480 or 800?
By homerdog on 6/17/2008 6:16:10 PM , Rating: 2
Heh, like I said, lots of mis/disinformation. I'm sticking with 800 as per the updated GPU-Z figures:
http://www.tweaktown.com/news/9691/index.html


RE: 480 or 800?
By ChronoReverse on 6/19/2008 10:28:05 AM , Rating: 2
A physical count of the die shot shows 180 units (meaning 20 are disabled for redundancy).


RE: 480 or 800?
By homerdog on 6/19/2008 3:01:27 PM , Rating: 2
Score :)


RE: 480 or 800?
By NullSubroutine on 6/20/2008 6:45:15 AM , Rating: 2
It's 800 shaders, with 40 disabled (reserved for FireGL cards), 40 TMUs, and 16 ROPs.


Looks like a winner to me.
By pauldovi on 6/17/2008 1:25:30 PM , Rating: 5
Hopefully ATI's sensible prices will bring Nvidia down to Earth with their ridiculous pricing.

I await the power consumption, performance, and cost and hope it will be better than the 8800GT.

One thing that concerns me: no support for OpenGL 2.1?




RE: Looks like a winner to me.
By FITCamaro on 6/17/2008 2:09:59 PM , Rating: 2
Is OpenGL 2.1 even out?


RE: Looks like a winner to me.
By amanojaku on 6/17/2008 3:17:37 PM , Rating: 2
RE: Looks like a winner to me.
By emboss on 6/18/2008 10:21:28 AM , Rating: 2
For all intents and purposes, AMD has frozen their OpenGL 2 development. Feature-wise, it's still stuck somewhere between the R400 and R500. I remember a statement from ATI saying basically that they were concentrating on OpenGL 3 development (pretty much a ground-up rewrite), and not to expect much more on the OpenGL 2 branch.

Last I heard (which was before the latest OGL3 delay), ATI was saying they'll have OpenGL 3 drivers for the X1K and later released within 6 months of the spec being finalized. Assuming this is still the case, this means that they should come out before Feb or so.


Why the obsession with GDDR5?
By boogle on 6/18/2008 4:49:32 AM , Rating: 3
Why is everyone so obsessed with GDDR5? Let's not forget that each new version of DRAM increases bandwidth at the expense of latency, and graphics is extremely latency sensitive. Unless the DRAM is running at a significantly faster speed (expensive) to mask the added latency, the performance difference is always very small. It just sounds better, and manufacturers can quote super-high bandwidth figures to make people go 'ooooooh'. X1950XT and X1950XTX anyone?
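
For context on the raw numbers being argued about here, peak memory bandwidth follows directly from bus width and effective data rate. A rough Python sketch, assuming the 256-bit bus and the roughly 2000 MT/s effective GDDR3 rate mentioned in the article; the GDDR5 rate below is only a placeholder for comparison, not a confirmed Radeon 4870 specification:

    # Peak bandwidth = (bus width in bytes) x (effective transfer rate).
    # The 2000 MT/s GDDR3 rate comes from the article; the GDDR5 rate is
    # an assumed placeholder, not a confirmed Radeon 4870 specification.
    def bandwidth_gb_s(bus_width_bits, effective_rate_mt_s):
        return (bus_width_bits / 8) * (effective_rate_mt_s * 1e6) / 1e9

    print(f"GDDR3, 256-bit @ 2000 MT/s: {bandwidth_gb_s(256, 2000):.1f} GB/s")  # 64.0
    print(f"GDDR5, 256-bit @ 3600 MT/s: {bandwidth_gb_s(256, 3600):.1f} GB/s")  # 115.2

Whether that extra headroom translates into frame rates is exactly the latency-versus-bandwidth question raised above.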




RE: Why the obsession with GDDR5?
By Targon on 6/19/2008 2:11:09 PM , Rating: 2
Don't confuse GDDR with DDR, because there are differences. Unlike back when we were seeing plain DDR-2 memory on video cards, the graphics memory on today's cards is NOT the same stuff you see for system memory.

GDDR3 != DDR-3


RE: Why the obsession with GDDR5?
By zsouthboy on 6/19/2008 2:13:43 PM , Rating: 3
Nonsense, graphics cards EXCEL at hiding memory latency.

Your standard CPU, on the other hand...


By Aeros on 6/17/2008 5:38:43 PM , Rating: 2
Unless it's a present for June 25th....




On its way!
By Pjotr on 6/19/2008 9:43:59 AM , Rating: 2
Komplett.se has these available today from Asus, PowerColor, HIS, and Sapphire; see http://www.komplett.se/k/search.aspx?q=4850 . I ordered one and it's already been shipped; the only problem is that Friday is a public holiday in Sweden, grr! I guess it'll arrive Monday then.




By hanishkvc on 6/19/2008 3:31:44 PM , Rating: 2
Hi All,

I'm a bit surprised that no one at DailyTech seems to have picked up on the GeForce 9800 GTX+ (maybe I should change my reading habits a bit ;-), which is supposed to debut in July for $229, making it a very competitive midrange card.

The 9800 GTX+ is built on the 55 nm process (similar to the ATI 4000 series) and runs at higher clocks than the 9800 GTX. So chances are the GTX+ will beat the 4850 by a good margin, because a single 4850 barely matches 9800 GTX performance based on the reviews so far.

The only area we may have to think about a bit is double-precision FP support and its impact on scientific computations. Maybe there the ATI 4850 could have an edge over the NVidia 9800 GTX+. Otherwise, the 800 SPs in the 4850 are really __160__ processing cores with 5 processing paths which aren't equal to one another, so one can't expect full 800-SP performance in all workloads. The 800-SP figure is a misnomer in some sense, so let's not get carried away by it. As for double-precision support, we'll have to see; maybe the ATI 4000 series has an edge over the NVidia G92 series. But then the NVidia G200 series is a slightly different ball game w.r.t. double-precision FP, and I have no idea how that stacks up against the ATI 4000 series for now.

Keep ;-)
HanishKVC
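
To put some illustrative numbers on the VLIW point above: if the 800 ALUs are organized as 160 five-wide units, delivered throughput scales with how many of the five lanes can be kept busy per instruction. The occupancy values in this Python sketch are assumptions chosen for illustration, not measurements:

    # Rough sketch of the VLIW5 argument above: 800 "SPs" = 160 units x 5 lanes,
    # so delivered FLOPS scale with average lane occupancy.
    # The occupancy values iterated below are illustrative assumptions.
    VLIW_UNITS = 160
    LANES_PER_UNIT = 5
    CORE_CLOCK_GHZ = 0.625
    PEAK_GFLOPS = VLIW_UNITS * LANES_PER_UNIT * 2 * CORE_CLOCK_GHZ  # 1000, assuming 1 MAD/lane/clock

    for lanes_filled in (5, 4, 3, 2):
        delivered = PEAK_GFLOPS * lanes_filled / LANES_PER_UNIT
        print(f"{lanes_filled}/5 lanes filled on average -> ~{delivered:.0f} GFLOPS delivered")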




Benchmarks
By Wirmish on 6/19/2008 3:49:26 PM , Rating: 2
http://tinyurl.com/4f7rl2 (French)
http://tinyurl.com/4s2rfw (French)
http://tinyurl.com/4e4cy9 (German)

HD 4850 --> $195 -> http://tinyurl.com/423ntf
9800 GTX -> $265 -> http://tinyurl.com/5k3vxh

1920x1050 4xAA 16xAF: HD 4850 = +20% > 9800 GTX




More DVI ports please
By CatfishKhan on 6/19/08, Rating: 0
Cool!
By FaceMaster on 6/17/08, Rating: -1
RE: Cool!
By Wartzay on 6/17/2008 1:49:32 PM , Rating: 2
http://www.xtremesystems.org/forums/showthread.php...

single midrange card, 30~ fps, 1900x1200, no AA

there is hope for the future!


RE: Cool!
By Pirks on 6/17/08, Rating: -1
RE: Cool!
By Parhel on 6/17/2008 3:37:45 PM , Rating: 3
Crysis Warhead isn't due out for maybe four months. If I were you, I might wait on that GTX 280. The 9800GX2 beats the GTX 280 by a considerable margin in regular Crysis. I wouldn't be surprised if the upcoming 4870X2 does also.


RE: Cool!
By ChronoReverse on 6/17/2008 3:43:44 PM , Rating: 2
http://www.forumdeluxx.de/forum/showthread.php?t=5...
Magic 8-ball Speculation says the 4850 CF already does.

With that said, I dislike CF and SLI because of the micro-stutter. If the 4870x2 really gets rid of that as rumoured, then it'll certainly be a smash.


RE: Cool!
By FITCamaro on 6/17/2008 4:14:57 PM , Rating: 2
Micro-stutter?


RE: Cool!
By ChronoReverse on 6/17/2008 4:21:48 PM , Rating: 3
It's a term coined for the effect multiple GPUs have on the framerate when using AFR mode.

It's caused by frames not being displayed smoothly: the multiple GPUs don't produce frames in a steady stream but rather in bursts of n (where n is the number of cards).

While not a problem for everyone, the effect does exist and is the reason why a single GPU tends to produce a smoother framerate.

There are rumours that the 4870x2 has something that will mitigate this (apparently it's not a simple solution... the rumours are about memory sharing between the GPUs), but I wouldn't put any hope in that until it's seen.
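
The frame-pacing effect described above is easy to see with toy numbers: two GPUs in AFR can hit roughly the same average frame rate as a single GPU while delivering frames in uneven pairs. The timestamps in this Python sketch are invented purely to illustrate the pattern:

    # Toy illustration of AFR "micro-stutter": similar average frame rate,
    # very different frame-to-frame spacing. Timestamps (ms) are made up.
    def frame_intervals(present_times_ms):
        return [t1 - t0 for t0, t1 in zip(present_times_ms, present_times_ms[1:])]

    single_gpu   = [i * 25 for i in range(8)]            # steady 25 ms per frame
    afr_two_gpus = [0, 5, 50, 55, 100, 105, 150, 155]    # frames arrive in bursts of two

    print("single GPU :", frame_intervals(single_gpu))    # [25, 25, 25, ...]
    print("AFR, 2 GPUs:", frame_intervals(afr_two_gpus))  # [5, 45, 5, 45, ...]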


RE: Cool!
By hadifa on 6/17/2008 7:34:40 PM , Rating: 2
And it is mostly associated with low-framerate situations, which is a category Crysis obviously falls into.


RE: Cool!
By NullSubroutine on 6/20/2008 6:38:41 AM , Rating: 2
From the AMD slides I have seen, there is no pooled memory (unfortunately). From what I can tell from their slides, it's the same design as the R680 but with a PCI-E 2.0 x16 bridge chip rather than the 1.1 version. And they will use GDDR5.


RE: Cool!
By Min Jia on 6/18/08, Rating: -1
RE: Cool!
By FITCamaro on 6/19/2008 2:23:42 PM , Rating: 5
Wait, so you bought two 9800GX2s in mid-March and now, less than 3 months later, replaced them?

Clearly you have more money than sense or a serious e-penis deficiency.


RE: Cool!
By Alpha4 on 6/19/2008 7:34:15 PM , Rating: 2
The "More money than sense" part is entirely subjective. And you needn't judge or insult. Who's to say he isn't a games developer or is hoping to leverage the 280's CUDA support?


RE: Cool!
By Alpha4 on 6/19/2008 7:38:28 PM , Rating: 1
Okay, maybe not entirely subjective. I definitely see where you're coming from. But that doesn't mean I don't love my 1024MB e-penis!


RE: Cool!
By yxalitis on 6/17/2008 7:40:40 PM , Rating: 2
AMD have superior XP drivers, and they support DX10.1, which nVidia still hasn't managed.
If nVidia did, more developers would use DX10.1 and not have to pull it from their games under pressure from nVidia... like Assassin's Creed.


RE: Cool!
By leexgx on 6/18/2008 12:23:17 AM , Rating: 2
I just went and got a second 8800GTX for £150.

Two 8800 GTs seem to give one GTX 280 a run for its £400+ price, so two 8800 GTX cards should keep me happy for some time (unless I sell my complete setup in the next 2-3 weeks :) ).

It's still a disappointment that Nvidia is not putting DX10.1 in their next-gen video card; games would benefit from some of the stuff that's in 10.1 (Assassin's Creed is one, but it was pulled citing a bug; it had to be removed because it was not working on Nvidia cards even though they don't support it).


RE: Cool!
By Parhel on 6/17/08, Rating: -1
Seems...
By DeepBlue1975 on 6/17/08, Rating: -1
RE: Seems...
By Lightning III on 6/17/2008 1:53:45 PM , Rating: 5
80 percent of the performance at less than half the price? Seems they are doing just fine by me.

At $199 this will be the best bang-for-the-buck card till the 4870 comes out.


RE: Seems...
By deeznuts on 6/17/2008 2:14:28 PM , Rating: 2
Is this really going to be $199? Damn... Xbox, or upgrade my computer/video card? I want to play Bioshock and Mass Effect now!

I'm getting Nehalem in the winter, so I might as well wait until then and just get a 360 I guess.

Hey, does anyone remember the AnandTech or other article a few years back, where Nvidia and ATI both said that there would be one or two generations of all-out performance and power usage, but then the next phase would be power reduction and efficiency? Has that started yet? Will it ever start?


RE: Seems...
By ChronoReverse on 6/17/2008 2:17:58 PM , Rating: 4
Nvidia went performance while ATI went power reduction this round.

That's why Nvidia has the undisputed high end.


RE: Seems...
By SavagePotato on 6/17/2008 5:56:52 PM , Rating: 1
That may not be the case.

Initial estimates were that the 4870x2 will be 1.25x faster than the 9800gx2.

If that holds true, then ATI will hold the performance crown, and do it for far less than the cost of the GTX 280.


RE: Seems...
By AnnihilatorX on 6/17/2008 6:19:42 PM , Rating: 2
The 4870x2 is just two 4870s together. And I believe the 4870 will sell at a bit less than half the price of the GTX 280 initially, until Nvidia drops the price, which it will have to if that's the case.


RE: Seems...
By ChronoReverse on 6/17/08, Rating: 0
RE: Seems...
By decapitator666 on 6/18/2008 6:13:23 PM , Rating: 3
at the cost of a new car?


RE: Seems...
By ChronoReverse on 6/19/2008 1:47:47 PM , Rating: 1
Obviously the point is that Nvidia can still claim to have _the_ fastest solution regardless of how expensive or impractical it is.

Of course, those who downrated my comment either completely missed that or are Nvidia types who just downrate anything that bashes the GTX280 (and its power consumption at load).


RE: Seems...
By DeepBlue1975 on 6/17/2008 7:57:16 PM , Rating: 2
Yeah, I think so, but that was not the point of my previous post.

The majority of the market doesn't analyze what the best bang for the buck is; they just ask some "tech savvy" friend what's best, and usually that "tech savvy" guy will point them towards the brand with the better reputation in the high-end market, even if he's advising the other guy to buy a mainstream product.


RE: Seems...
By FITCamaro on 6/17/2008 2:22:23 PM , Rating: 5
Which cards do you think sell more? Ones that are $400-650 or ones that are $200-300? Now ask which one you think AMD will be able to make more money off of. Yes, having the top-performing card gives you bragging rights. But bragging rights don't keep you in business.

Besides, I think most people care more about having a gaming PC they can sit next to without going deaf or needing a cooling suit than about an extra 5 FPS.


RE: Seems...
By FITCamaro on 6/17/2008 2:23:45 PM , Rating: 5
Also, an OEM is far more likely to use your card in their system when it's a) cheaper and b) uses less power.


RE: Seems...
By ChronoReverse on 6/17/2008 2:25:16 PM , Rating: 2
Well, it also depends on how much it costs to make the cards.

The 4850 has GDDR3 and a 256-bit memory interface. It's also a relatively small chip. This means it should be quite profitable for AMD to sell them at $200.

On the other hand, the G92 9800GTX, while also using GDDR3 and a 256-bit memory interface, is a larger chip and thus costs more for Nvidia to make. Performance also looks to be similar to the 4850's (not at all confirmed).


RE: Seems...
By DeepBlue1975 on 6/19/2008 11:37:53 AM , Rating: 2
You don't get the point.
I'm talking about marketing strategies. Highest-end products are not about sales volume; they give the selling brand an image of market leadership that you don't get with high-volume, low-cost, low-profit products. That doesn't mean a company should stop making low-end products; my point is that they shouldn't neglect the image-building high-end market, because not having a strong presence there hurts market share in the long run.
And OEMs usually tend to team up with the market's leading brands (remember the "Intel Inside" campaign? I thought so).
If a brand settles for mid- to low-end parts only for a long period, they end up being taken for a low-quality, untrustworthy brand in the mind of the public.

You probably know that flagship products are not as much about selling them as they are about publicity and bragging rights for the brand that produces them.

I myself usually get mainstream products for my machine except for some specific parts (mobo, PSU, keyboard, mouse), and as for video performance, I couldn't care less since I almost don't play games any more. But then again, I'm not talking about what I buy or would buy, but about a brand's public image.


RE: Seems...
By ChronoReverse on 6/19/2008 1:49:56 PM , Rating: 3
With that said, ATI's strategy is that their high-end card is the 4870x2. It's unfortunate that it's coming late, but it is meant to go up against the GTX 280.

ATI knew a single chip was unlikely to beat Nvidia's and didn't engineer for that; they went for a (hopefully) scalable multi-GPU design from the start.

We'll see in a couple months whether this will be a win or not.


RE: Seems...
By Belard on 6/20/2008 5:20:09 AM , Rating: 2
Bragging rights (even if not true) can help sell product. It's the same with CPUs. Look at Intel and AMD. Since the VIA KT-133A chipset and the Athlon XP, AMD had CPUs that were generally faster than the Pentium 4 (other than in video encoding), and with AMD64 they continued to murder Intel's NetBurst technology. Yet AMD only slowly gained market share while Intel kept running Blue Man Group ads about how much faster their CPUs were. At AMD's best, in the summer of 2006, they had about 20% of the market with NO TV ads, and pretty much any store you went to, AMD was actually selling quite well. Remember, the $250 AMD64 2.2GHz CPU could play games and do typical office work faster than Intel's $1000 Extreme 3.2GHz P4s. Then Core 2 came out; it was very cheap, far less than the P4 line and AMD's CPUs, and faster. AMD's respect and market gains fell through the floor.

It sucks, because not ALL AMD CPUs are slower than all Intel CPUs. At various price points, counting the cost of a motherboard with on-board graphics, the AMD setup ends up costing a lot less than Intel. And the Core 2 design is not as far ahead of AMD as AMD was ahead of the P4.

AMD should have known there is only so much brand loyalty a person will show. I generally prefer AMD, and for most people it's a great chip and setup for the price... just not the fastest. They screwed up on the X4 CPUs and with the motherboard makers by not making sure things were up to spec, so using an X4 AMD CPU may cause the latest motherboard to blow up! For low-end to mid/high-end, AMD is a better deal. For overclocking and top speed, Intel... but you'll still pay for it.

Gigabyte AMD 770 : $80 = PCIe 2.0 slots, RAID, FireWire
Gigabyte Intel P35 : $90 = PCIe 1.0 slots
Gigabyte Intel P35 : $120 = PCIe 1.0 slots, RAID
Gigabyte Intel P35 : $180 = PCIe 1.0 w/ 2 x16 slots (one is electrically 4x), RAID, FireWire

Gigabyte Intel X48 : $225 = PCIe 1.0 slots (2 x16), RAID, FireWire, eSATA (bracket)
Gigabyte AMD 790FX : $180 = PCIe 2.0 slots (2 x16), RAID, FireWire, eSATA

With that said, because the C2Q 6600 is $190 with a plain $100 board, it's a BETTER deal than buying an AMD X4 CPU, and it will be my next purchase. I get a cheaper, faster and more reliable system. But I'll be looking at dropping in the AMD 4850 video card :)

It's no secret that the $100~200 range is the SWEET spot for the industry. Yeah, they sell a whole lot of $50~100 cards, but those buyers are rarely gamers. Pretty much every video card I've bought has been $125~225.

I agree though, it's silly to spend an extra $1000~2000 to go from 80 fps to 130 fps on a 24" LCD monitor.


RE: Seems...
By Goty on 6/19/2008 1:44:53 AM , Rating: 2
High end and high profit don't go together. You might be surprised to know that ATI and NVIDIA both make a hell of a lot more money off of the midrange and entry level/integrated market than they do off of the high end.


RE: Seems...
By NuclearDelta on 6/19/2008 3:58:43 AM , Rating: 2
ATI recovery, I like it.


Bad article
By Proteusza on 6/17/08, Rating: -1
RE: Bad article
By KristopherKubicki (blog) on 6/17/08, Rating: 0
RE: Bad article
By gaakf on 6/17/2008 3:59:01 PM , Rating: 5
Only from a pure performance perspective is it inferior. But at a $200 price point, it should be superior in performance per dollar. The card is more feature-rich than anything Nvidia has right now. I would also assume from the die shrink that it consumes less power as well, making it superior on a performance-per-watt scale. So no, I do not believe AMD would tell me their product is inferior.


RE: Bad article
By FITCamaro on 6/17/2008 4:20:34 PM , Rating: 5
He has a point. Like it or not, use it or not, ATI has had DX10.1 support for months now. While Nvidia implemented some of it with the GTX 2x0 series, it's still not all there. Developers might have largely said it's not important, but tell that to the developers who are using it.

And a recent article showed that with Assassin's Creed, the exe that utilized 10.1 ran faster than the one without it. They removed it because it is a "The Way It's Meant to Be Played" game and Nvidia wouldn't like ATI's card performing better than theirs with features they don't have.

Just shows you the politics that exist in the video game market.

I really wish they had gotten the 4850 out sooner so I could have gone with it instead of two 8800GTSs.


RE: Bad article
By SavagePotato on 6/17/2008 6:01:08 PM , Rating: 4
Considering that, price-wise, it is competition for the 8800GT and it performs on par with a 9800GTX, it sounds to me like the ATI offering is not inferior but superior.

The GTX 280 is a big fat turkey that is overpriced and lacking.

The way things are shaping up, the 4870x2 might indeed trump the GTX 280.


RE: Bad article
By bill3 on 6/18/2008 8:16:54 AM , Rating: 5
It's a $200 video card said to be as fast as the 9800GTX, which currently costs about $300. I'd say that's pretty good.

Then the later 4870 at $329 will likely give the GTX 260, if not the 280, a run for its money. Then finally, the 4870X2 will likely be faster than the GTX 280. Leaked 3DMark scores show the 4870X2 with an edge over the 280. When was the last time AMD's fastest card was better than Nvidia's???

Also, look at value: leaked scores also show 4850 CrossFire stomping the GTX 280 for $250 less.

I think people are jumping on you, Kubicki, because... I can't even put my finger on it, but the article just seems pretty negative on AMD's prospects. In contrast, most of the web seems to be realizing AMD is likely to have some kickass products here.

Seems to me Nvidia is struggling this go-round. We all know GT200 is a 576 mm² die, while RV770 is just 256 mm². Yet for all that, AMD smashed a teraflop and the 4800 line seems blazing fast.

The fact AMD packed 800 SPs into a die HALF the size of Nvidia's is pretty amazing, and makes one wonder what the hell Nvidia's engineers are doing.

It appears to me AMD was right when they said the era of the monolithic chip is over, and Nvidia looks wrong. In raw performance, AMD slides claim 2.8X more performance (flops) per die area than Nvidia, and 2.25X per watt.


RE: Bad article
By KristopherKubicki (blog) on 6/18/2008 2:26:33 PM , Rating: 1
I suppose. It seems pretty obvious that it's a midrange card, and I don't know why anyone would contest that. I've also seen "leaked" 3DMark scores from an ASUS employee that show a CrossFire 4850 beating up a single GTX 280:

http://www.kevingu.com/?p=11

But a look at 3DMark's leaderboard seems to show a lot of disparity between Kevin's score and others who already have the card:

http://service.futuremark.com/search/3dmarkvantage...

The 4850 is ultimately the card I'm going to get. It's by far the best price/performance card. But then again I always buy midrange cards.


RE: Bad article
By Goty on 6/19/2008 1:41:49 AM , Rating: 3
Ummm... his results seem to be perfectly in line with a stock GTX 280.


RE: Bad article
By Goty on 6/18/2008 1:37:00 PM , Rating: 5
It's a $200 video card that, when implemented in a dual-chip configuration, is at LEAST as fast as the GTX 280 by all indicators. Let's take $200x2 to get a rough price of ~$400 for a Radeon HD4850X2 (or a crossfire setup) and compare that to $500-$600 for a GTX 280 that performs almost identically.

Better features, the same performance, and $100 or more less? I'm not complaining.

This isn't even considering the performance gains to be had by the HD4870, either.

Is it just me, or does this generation seem to be shaping up to be the Radeon 8500-to-9700 jump all over again? Obviously NVIDIA's card doesn't suck this time around, but ATI is definitely poised to make a decent move in the market.


RE: Bad article
By Chaser on 6/17/2008 3:59:01 PM , Rating: 2
The 9800GTX is far from a slouch. The lowest price I could find for it is $269.00. The 4850 starting at $199.00 sounds like a pretty good deal to me.

The upcoming 4870 with its announced GDDR5 memory sounds like it will be a competitive "high end" card to me.

We shall see.


RE: Bad article
By rgsaunders on 6/19/2008 5:04:10 PM , Rating: 2
http://fxvideocards.com/VisionTek-Radeon-HD-4850-p... -- $195 is a pretty good price; I have seen it listed for $190 elsewhere.

See http://www.tweaktown.com/news/9703/amd_radeon_hd_4... for some interesting first reviews, including a CrossFire review.

Good bang for buck.


"We’re Apple. We don’t wear suits. We don’t even own suits." -- Apple CEO Steve Jobs














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki