AMD's newest is an alternative to NVIDIA's last-generation high-end

Hot on the heels of NVIDIA's GTX 200 family launch, AMD will introduce its 55nm RV770-based Radeon 4850 next week. 

The Radeon 4850 features a 625 MHz core clock and a GDDR3 clock in excess of 2000 MHz. Corporate documentation explains that the 480 stream processors on the RV770 offer considerable enhancements over the 320 stream processors found in the RV670 core, though AMD memos reveal little about how this is accomplished.

The RV770 includes all the bells and whistles of the RV670 launched in November 2007: Shader Model 4.1, OpenGL 2.0, and DirectX 10.1.  The only major addition appears to be "Game Physics processing" -- indicating a potential platform for AMD's recent partnership with Havok.

The new Radeon lacks the GDDR5 memory promised in an AMD announcement just weeks ago. Although the RV770 does support GDDR5, this initial launch consists exclusively of GDDR3 components.  AMD documentation hints at the launch of a Radeon 4870 later this summer, but offers no comment on when the company will eventually ship a GDDR5 product.

If the Radeon 4850 sounds familiar, that's because it is. The RV770-based FireStream 9250, announced just a few days ago, broke the 1 teraflop barrier using the same graphics core.  However, that paper-launched workstation card will retail for more than $900 when it finally hits store shelves.  The mainstream Radeon 4850 offerings, by contrast, will launch and ship on the same day next week.

AMD partners claim the new card will not compete against the $600-plus GeForce GTX 280 announced just yesterday. Instead, AMD pits the Radeon 4850 against the recently re-priced NVIDIA GeForce 9800 GTX.  Distributors claim the 4850 will see prices as low as $199 at launch -- well under the $299 MSRP for the GeForce 9800 GTX.  More expensive RV770-based cards will feature HDMI, audio pass-through, and possibly the fabled Qimonda GDDR5 memory.

Specifications from Diamond Multimedia marketing material claim the new Radeon will require a 450 watt power supply for a single card, or a 550 watt power supply for CrossFire mode.

Update 06/09/2008: As of this morning, AMD has lifted the embargo on its 4850 graphics cards. AMD's newest documentation claims the RV770 processor contains 800 stream processors rather than the 480 cited earlier, though the card is not expected to show up on store shelves before the planned June 25 launch date.
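That updated shader count also lines up with the FireStream 9250's one-teraflop claim mentioned above. As a rough sanity check -- assuming the conventional marketing count of two floating-point operations (one multiply-add) per stream processor per clock and the Radeon 4850's 625 MHz core clock; the per-clock figure is an assumption on our part, not something spelled out in AMD's documentation:

  # Back-of-the-envelope peak throughput for RV770 (assumed figures noted above)
  stream_processors = 800        # updated count from AMD's launch documentation
  flops_per_sp_per_clock = 2     # assumes one multiply-add (2 FLOPs) per stream processor per clock
  core_clock_ghz = 0.625         # Radeon 4850 core clock, 625 MHz

  peak_gflops = stream_processors * flops_per_sp_per_clock * core_clock_ghz
  print(f"~{peak_gflops:.0f} GFLOPS peak")   # ~1000 GFLOPS, i.e. roughly 1 teraflop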



Comments



Over $300 makes a video card irrelevant
By gochichi on 6/17/2008 4:27:11 PM , Rating: 5
Nvidia is going to have to re-refresh their 8800GTS level of cards so they can turn a better profit from them.

Remember that ATI has a smaller process and is making a better profit on every HD3870 sold (as opposed to every 8800GT sold).

ATI has decided to keep it green, clean, and priced right and it's absolutely the right decision in my opinion. Computer gaming can't be based on $600 video cards... people don't want that, and they don't even need that.

People are thinking: get a PS3, an XBOX 360, or a video card? They already have their new DELL, HP, or eMachine with a dual core processor... they will choose the video card if it's about $200.00 and doesn't need a $200 power supply to work.

ATI, specifically the 4850, will be the card of choice for 12 months... I can almost guarantee that.

Anybody play Call of Duty 4? People are still using Radeon X1950s and single core processors... or a 7900GT or whatever. Why get a $600 video card when you can get a complete, really excellent gaming machine for the same price? I play COD4 with max settings, no AA, at 1920 x 1200 on an HD3870... I'm not about to spend a ton of money to turn 4xAA on... no way.

It's about fun and games, not about benchmarks, and I think ATI understands that. They did well to let NVIDIA release their overpriced junk (in a slowing economy no less); ATI is going to sell as many new Radeons as they can deliver.

The more time passes, the more it seems that NVIDIA just stumbled upon the right formula with the 8800GT. Now ATI is running with that formula, keeping the price down and adding features, and it will absolutely sell these units.




RE: Over $300 makes a video card irrelevant
By BruceLeet on 6/18/2008 3:25:50 AM , Rating: 2
4xAA isn't really needed at 1920x1200 anyway; COD4 has small maps. About the pricing, I agree with you. I'm 100% confident my next upgrade or build will be AMD offerings, since the price:performance favors AMD. Then again, there are people who only pay attention to logos.

AMD should do Mac-style commercials about Nvidia. Well, not TV commercials, just little episodes you could watch on their AMD GAME! site, just to see the kind of things that would be said. Improv, anyone?


RE: Over $300 makes a video card irrelevant
By freeagle on 6/18/2008 9:00:03 AM , Rating: 5
quote:
4xAA isn't really needed on 1920x1200 anyway, COD4 has small maps


Could you please explain the relationship between map size, video resolution and antialiasing?


By Goty on 6/18/2008 1:40:47 PM , Rating: 2
I don't know, but the aliasing is pretty bad at 1920x1200 on CoD4 for me. 2xAA is plenty, though, in that case.


RE: Over $300 makes a video card irrelevant
By BruceLeet on 6/18/2008 2:12:21 PM , Rating: 3
I play two games; one of them is viewable up to 1000 game meters. I need AA in that game, since it's hard to see sniper noobs at a distance if they are near a wall of trees -- in that particular game, that is. In COD4, though, I can shoot across most maps without having a problem seeing at 1920x1200.

Can you see where I'm coming from yet?


RE: Over $300 makes a video card irrelevant
By ChronoReverse on 6/19/2008 1:45:33 PM , Rating: 2
AA doesn't just allow you to see things; it removes "jaggies", which are VERY visible even at 1920x1200 on most monitors. The jaggies won't go away unless you have a very good dot pitch, and even then 2xAA would make a big difference.
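To make that concrete, here's a minimal, purely illustrative sketch of supersampling, the simplest form of AA (this is a toy with made-up function names, not how COD4 or any particular driver implements it): each screen pixel averages several sub-pixel samples, so an edge crossing a pixel comes out as an intermediate shade instead of a hard stair-step.

  # Toy supersampling demo: average sub-pixel samples so edges blend instead of stair-stepping.
  def coverage(x, y):
      # Toy scene: everything below the diagonal y = x counts as the object.
      return 1.0 if y < x else 0.0

  def shade_pixel(px, py, samples_per_axis=1):
      total = 0.0
      for i in range(samples_per_axis):
          for j in range(samples_per_axis):
              sx = px + (i + 0.5) / samples_per_axis  # sample positions spread
              sy = py + (j + 0.5) / samples_per_axis  # evenly inside the pixel
              total += coverage(sx, sy)
      return total / (samples_per_axis ** 2)

  row = 3
  print([shade_pixel(x, row, 1) for x in range(6)])  # no AA: a hard 0.0 -> 1.0 step at the edge
  print([shade_pixel(x, row, 2) for x in range(6)])  # 2x2 supersampling: a 0.25 blend appears at the edge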


By Martimus on 6/19/2008 4:21:53 PM , Rating: 2
In fact, AA would probably make characters harder to see from a distance, since their silhouette would be smoother than without it. (One of the easiest ways to spot characters against the background is to look for the pixelated lines they create against the backdrop.)


By BruceLeet on 6/19/2008 4:57:51 PM , Rating: 2
Well, try to think about it: you're looking for an object in trees at a distance in a game. That's a lot of lines (the trees) and a lot of jaggies, and you simply can't make anything out -- that's where I need AA. But in COD4, no... it's a fast-paced game with a lot of movement. You need MAX fps in that game, so I don't use AA.


RE: Over $300 makes a video card irrelevant
By jarman on 6/18/08, Rating: -1
RE: Over $300 makes a video card irrelevant
By Goty on 6/19/2008 1:36:28 AM , Rating: 5
Who's drinking the kool-aid now? AMD is launching a new card because *gasp* it's in their product cycle! A new architecture released a year after the previous one? It's unheard of!

As for your claims that ATI can only compete in the mid-range, what did you call the HD3870X2? What do you call the HD4850, a midrange card that performs on par with the 9800GTX using a die smaller than an American dime? And what are you going to call the HD4870X2 that, as I've said before, should by all indicators outperform the GTX 280 by a healthy margin?


RE: Over $300 makes a video card irrelevant
By Alpha4 on 6/19/08, Rating: 0
RE: Over $300 makes a video card irrelevant
By NullSubroutine on 6/20/2008 6:35:34 AM , Rating: 3
I don't know if you are smoking something or just plain spreading FUD, but the 4870 X2 (aka R700) will be released 8 weeks after the 4870, which is June 25th. So that places it in August.

They are waiting to get the drivers right before the launch, but I have already seen benchmarks run on current drivers and it's beating the GTX 280.


By Alpha4 on 6/20/2008 11:25:11 AM , Rating: 2
Just curious, where did you read about the 4870x2 release date? Or even the 4870 at that? The article says "AMD documentation hints at the launch of a Radeon 4870 later this summer".

I'll try to provide links later where I read about the 4870x2 release date. I'm @ work at the moment unfortunately.


RE: Over $300 makes a video card irrelevant
By jarman on 6/19/2008 6:14:35 PM , Rating: 2
Really? Tell me how that "product cycle" has influenced AMD's earnings over the last two quarters as compared to nVidia's...


By Goty on 6/20/2008 1:39:22 AM , Rating: 2
Gee, that's a little hard to say, isn't it, since the product didn't launch in the last two quarters?


RE: Over $300 makes a video card irrelevant
By Lightnix on 6/19/2008 2:40:38 PM , Rating: 3
The 4850 costs well under half the price of the GTX 280 ($200 vs. $649), and two of them in CrossFire seem to beat the GTX 280 by a reasonable margin in a fair number of scenarios. The 4870 X2 will be just this, but clocked higher on the core and much higher on the memory (GDDR5). AMD are planning on competing in the high end, just with two small cores rather than one huge one.


RE: Over $300 makes a video card irrelevant
By just4U on 6/19/2008 9:26:28 PM , Rating: 2
Yeah, I don't really factor CrossFire in myself. I just look at the card, see that it's on par with or beating the 9800GTX with a smaller footprint at the $200 price point, and think... Hmmmmmmm :)

I was very pleased with ATI and Nvidia when they had their respective launches this past Nov/Dec. Those midrange cards kicked things into high gear, and you can tell that both companies made money off of it. I'm glad that AMD has decided to continue the trend, because that's where the profits are!


By Schrag4 on 6/20/2008 12:16:23 PM , Rating: 2
Agree about not factoring in CrossFire. If you're the type that will have dual cards in your system, then price really doesn't matter, right? Am I way off here? Can't you get more performance from a single, higher-end card vs. buying 2 cheap cards? Which means the only people with multiple cards have 2 very high-end cards (and obviously don't care how much money they spend).

As a real-world example, I bought my NVIDIA 7800 GT almost 3 years ago for right around 300 bucks. I *could* put a second card in my machine, but I would see much better performance if I just scrapped the 7800 and got a card that's 2 generations newer. And no, I'm not going to spend between 400 and 1300 bucks for a dual-card solution. If I had that kind of money, I wouldn't still be using a 7800GT today, would I...


"I modded down, down, down, and the flames went higher." -- Sven Olsen














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki