



[Image: Comparison data from ATI. ATI's roadmap shows the introduction of mainstream DX11 hardware.]

When ATI launched the Radeon HD 4870 last year, it delivered a new level of performance at a great price point. NVIDIA struggled to compete, but still managed to hold onto much of its market share.

With the launch of Windows 7 and DirectX 11 on October 22, many video card buyers are already looking for DX11 cards in order to be ready for the latest games and to future-proof their systems. ATI is giving the public that option by launching the Radeon 5800 series today.

The Radeon HD 5870 and the Radeon HD 5850 both use the same GPU core, previously codenamed Cypress. Comprising 2.15 billion transistors, it has almost three times as many as Intel's Core i7 CPU. The Radeon 5870 runs at a core clock of 850MHz, while the Radeon 5850 runs at 725MHz.

The Radeon 5870 has an impressive 1600 stream processors and is capable of delivering up to 2.72 teraFLOPS. It is the most powerful single-GPU video card currently available and will sell for around $379 at e-tailers. Meanwhile, the Radeon 5850 has 1440 stream processors and will be priced around $259. Both cards are paired with 1GB of GDDR5 memory and use a dual-slot design.
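That throughput figure follows directly from the shader count and clock, since each stream processor can retire one multiply-add (two floating-point operations) per cycle. A quick sanity check of the arithmetic, in Python:

    shaders = 1600            # stream processors on the Radeon 5870
    clock_hz = 850e6          # 850MHz core clock
    flops_per_cycle = 2       # one multiply-add = 2 FLOPs per shader per cycle
    print(shaders * clock_hz * flops_per_cycle / 1e12)   # -> 2.72 teraFLOPS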

One of the more interesting things that ATI is doing is called Eyefinity. It allows up to three monitors to be driven from a single video card. All cards in the 5800 series have two dual-link DVI ports, an HDMI port, and a DisplayPort. Since each DVI or HDMI output requires its own TMDS transmitter and the card carries only two, a DisplayPort monitor or adapter is needed for the third display. Eyefinity is capable of supporting three displays at resolutions of up to 2560x1600 each, and future models will support up to six monitors.
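A minimal sketch of that output constraint (illustrative only, not AMD's driver logic; the two-transmitter limit described above is the key assumption):

    TMDS_LIMIT = 2   # DVI/HDMI outputs share two TMDS transmitters per card

    def eyefinity_ok(connections):
        """connections: 'dvi', 'hdmi', or 'displayport' for each monitor."""
        tmds = sum(1 for c in connections if c in ('dvi', 'hdmi'))
        return len(connections) <= 3 and tmds <= TMDS_LIMIT

    print(eyefinity_ok(['dvi', 'dvi', 'displayport']))  # True
    print(eyefinity_ok(['dvi', 'dvi', 'hdmi']))         # False: third display needs DisplayPort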

Power efficiency is something that the computer industry as a whole has been moving toward. The Radeon 5870 has an idle board power of 27 watts, compared to 90 watts for the Radeon HD 4870. This is due to better power circuitry and a larger reduction in clock speeds at idle. Unlike newer CPUs, it does not have the ability to shut down idle processing units, but that capability is inevitable in future GPUs.
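To put the 63-watt idle improvement in perspective, a rough sketch of the yearly savings (the eight idle hours per day and the $0.12/kWh electricity rate are assumptions for illustration only):

    idle_delta_w = 90 - 27              # watts saved at idle vs. the HD 4870
    hours = 8 * 365                     # assumed idle hours per year
    kwh = idle_delta_w * hours / 1000
    print(kwh, kwh * 0.12)              # ~184 kWh, roughly $22 per year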

ATI first adopted TSMC's 40nm process with the RV740 core used in the Radeon HD 4770. With the second generation of that process, ATI's yields on the 40nm node are quite high. DailyTech has been told that the launch is "rock hard" and that products will be available today.

As the first DX11 parts, these cards face huge pent-up demand that will take several weeks to dissipate. Availability is therefore expected to be tight for the next two weeks. Demand is expected to rise again around the October 22 launch of Windows 7.

Most video card manufacturers are expected to use the reference design to save costs, but some are already planning modified coolers to enable better overclocking or reduce noise.

ATI plans to ship more DirectX 11 parts in the near future. Hemlock combines two Cypress cores into a Radeon HD 5870 X2 configuration, while Juniper will be the 5800 series' equivalent of the Radeon 4830, selling at the $199 price point.

DirectX 11 hits the mainstream early next year with two more affordable products, likely to be known as the Radeon 5600 series and comprising the Radeon HD 5670 and Radeon HD 5650. Ultimately, the technology seen in the Radeon 5800 will move into mobile computers and integrated chipsets over the next year.



Comments



Nvidia are definitely on the back foot....
By Amiga500 on 9/23/2009 2:50:02 PM , Rating: 5
How long has it been since they let ATi do something like this without some kind of distraction in the press?

GT300 must be giving them real headaches...

Anyway, the 5870 looks like a real step forward, nearing the 295 in performance and trouncing it in Crossfire. The Crossfire performance bodes well for the 5870x2.




RE: Nvidia are definitely on the back foot....
By omnicronx on 9/23/2009 2:53:23 PM , Rating: 5
I think the big thing here is that it will be the only DX11 card out when Windows 7 is released, and that is huge for AMD/ATI. That means developers will likely optimize games for these cards first, which was an advantage Nvidia had last time around.


RE: Nvidia are definitely on the back foot....
By Cypherdude1 on 9/24/2009 5:51:26 AM , Rating: 2
quote:
the use of a DisplayPort monitor or adapter is needed to use the third monitor

What does this mean exactly? If you have a third DVI LCD monitor, is there a DisplayPort to DVI adapter available which you can use to connect it to this card?


RE: Nvidia are definitely on the back foot....
By soundgy on 9/24/2009 7:55:30 AM , Rating: 2
Yes. They already have a few of these adapters out now, just like the adapters used on the newest MacBook Pros that have Mini DisplayPort out instead of DVI.


RE: Nvidia are definitely on the back foot....
By Hieyeck on 9/24/2009 8:33:06 AM , Rating: 2
What is this new displayport (Kidding)?! I just got done removing our final VGA monitor at work yesterday (Not kidding)!

Seriously, didn't DVI FINALLY take hold just a few years back? IIRC displayport and DVI have similar bandwidth too, why the shift? And how would adapters work, aren't they electrically incompatible?


RE: Nvidia are definitely on the back foot....
By soundgy on 9/24/2009 9:22:43 AM , Rating: 2
Unfortunately I do not know how they work (Fairy Dust!). But I do know that DisplayPort and Mini DisplayPort can carry digital audio where DVI can't. And DisplayPort has the mini connector, which is awesome for next-generation graphics cards: they can have 6 ports lined up yet only take up a single/double slot. Anyone else care to add?


By MikeMurphy on 9/24/2009 4:12:16 PM , Rating: 5
DisplayPort is packet-based while DVI/HDMI are not.

It will allow all sorts of data to be transferred over the same cable that handles your video. Examples include audio, USB, and eSATA connectivity from your monitor without the need for any additional cables.

It's arguably what HDMI should have been.
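A toy illustration of the packet idea (this is not DisplayPort's actual micro-packet framing): once a link carries generic packets, different payload types can be interleaved over one cable.

    import itertools

    def multiplex(*streams):
        """Round-robin packets from several sources onto one 'cable'."""
        for group in itertools.zip_longest(*streams):
            for packet in group:
                if packet is not None:
                    yield packet

    video = [('video', i) for i in range(3)]
    audio = [('audio', i) for i in range(2)]
    aux   = [('usb', 0)]
    print(list(multiplex(video, audio, aux)))   # video, audio, and USB share the link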


RE: Nvidia are definitely on the back foot....
By ChuckDriver on 9/24/2009 9:42:11 AM , Rating: 2
quote:
I just got done removing our final VGA monitor at work yesterday (Not kidding)!


I've been doing some work at a place where someone bought some really cheap Dell Inspirons in the fall of 2006. They are running what must be the last LCDs made with only an analog connection. This is frustrating when $30 more would have gotten the same monitor plus a DVI input. Unfortunately for them, they'll be dealing with these things well into the next decade.


RE: Nvidia are definitely on the back foot....
By Cypherdude1 on 9/24/2009 11:29:44 AM , Rating: 3
quote:
I've been doing some work at a place where someone bought some really cheap Dell Inspirons in the Fall of 2006. They are running what must be the last LCDs made with only an analog connection.

Why should you care what type of connection the monitor uses? If you're looking at the connection, you're looking at the wrong side of the monitor. You're supposed to look at the face of the monitor, not the back. Does it really matter whether it's connected via Analog RGB, DVI, or DisplayPort? Personally, as long as the picture is sharp, the refresh rate is high enough, and I can do the work, it doesn't make any difference how it's connected.

The same goes for the DirectX editions. Are there really any noticeable performance and quality differences between 9, 10, and 10.1? Side by side, would anyone be able to tell the difference?


By ChuckDriver on 9/24/2009 4:27:36 PM , Rating: 3
quote:
Why should you care what type of connection the monitor uses?


http://en.wikipedia.org/wiki/HDCP

I still think that you make a good point. Being an enthusiast and making a little money from providing support, however, I do spend time looking at the back of the monitor. Generally, LCDs do a decent job with an analog signal, but there are annoying quirks. Booting between Windows and Linux, the edges of the desktop are cut off when using VGA because there are timing differences in the modeline each OS uses, despite being set to the same resolution. Fixing it is not difficult, just going two levels into the OSD to perform an auto-adjust, but it is annoying and is not a problem with DVI. Also, for me, text appears sharper with DVI.
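For anyone unfamiliar, a modeline pins down the exact pixel timings behind a resolution, so two OSes at the "same" resolution can still disagree on where the visible area sits. A small sketch using the common CVT timings for 1280x1024@60 (numbers shown for illustration):

    dotclock_hz = 109.00e6          # pixel clock from the modeline
    htotal, vtotal = 1712, 1063     # total pixels per line, total lines per frame
    print(round(dotclock_hz / (htotal * vtotal), 2))   # ~59.89 Hz, not exactly 60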


RE: Nvidia are definitely on the back foot....
By afkrotch on 9/25/2009 5:09:55 AM , Rating: 2
It depends what you are going to be using the connection with.

- Can't be rocking a 30" monitor with an analog connection.
- No HD without DVI, HDMI, or DisplayPort. Not that anything even used HDCP yet.
- HDMI and DisplayPort carry audio, so you can have a single cable connection to your monitor. If you plan to use the speakers in the monitor.
- DisplayPort doesn't have a licensing fee, so in theory, monitor prices would be lower.

I don't have DisplayPort, so I can't really comment much on how it actually is, but I hate the hell out of HDMI. I connect my 360 up to my 26" Asus monitor via HDMI. I adjust my monitor's forward/back lean a lot, and this minor movement is enough to cause a loss of signal. The HDMI cable doesn't fall out or become disconnected; it's just that the pins no longer touch.

These new cables are nice and small, but they annoy me. SATA does the same kind of crap. They need to work on better locking mechanisms for these things.


By Lazarus Dark on 9/27/2009 12:15:29 AM , Rating: 2
"No HD without DVI, HDMI, or DisplayPort. Not that anything even used HDCP yet."
Bluray on a PC requires HDCP, it won't play if you use VGA, unless you're using AnyDVD.
I think Bluray is probably the only use HDCP will ever have.


By JasonMick (blog) on 9/23/2009 2:51:33 PM , Rating: 4
Agreed -- with this generation AMD is yet again beating NVIDIA cards priced $100 or more higher. Crossfire 5870 certainly is producing some drool-worthy results.

Now if AMD can only get their act together and release some mobile GPUs -- NVIDIA just released 5 new 40 nm mobile GPUs. With the improved power/heat envelope of these GPUs it seems silly if they *don't* try to jump deeper into the mobile market.


RE: Nvidia are definitely on the back foot....
By jonmcc33 on 9/23/2009 3:21:18 PM , Rating: 2
Pointless due to the netbook market (Intel 945G) and Apple using only nVIDIA chipsets. AMD's strength is with their own chipsets for their own processors.

I do believe that AMD makes some sort of money for Crossfire licensing on Intel chipsets though. I will always buy nothing but AMD/ATi anyway. No worry about the constant driver crashes or faulty GPUs that nVIDIA brings.

Of course these new 5870/5850 video cards are real surf boards. I think I'll wait for a 5830 if they are shorter and just Crossfire a pair.


RE: Nvidia are definitely on the back foot....
By Totally on 9/23/2009 3:46:29 PM , Rating: 2
Didn't stop Nvidia, as they are making inroads with ION. It shouldn't be too hard to create something comparable and get into bed with VIA.


RE: Nvidia are definitely on the back foot....
By dagamer34 on 9/23/2009 4:18:43 PM , Rating: 2
Problem is that the next gen Atom (Pineview) is going to have graphics integrated on die on ALL chips. That means that bundling an Atom CPU with a nVidia GPU is going to cost more than it does now and affect a manufacturer's bottom line if prices are not increased in tandem.


RE: Nvidia are definitely on the back foot....
By monomer on 9/23/2009 5:57:32 PM , Rating: 2
That's right, the integrated market will quickly be shrinking since the same thing will be happening on the rest of Intel's processors as well.


By Lerianis on 9/26/2009 1:36:06 AM , Rating: 2
Ah, but Intel is known to have CRAPPY graphics chips.... I'm not being harsh on them, I'm just telling the blunt truth there!


By ET on 9/24/2009 1:41:05 AM , Rating: 2
The 5850 is shorter than the 5870. Maybe it'd be short enough for you.


RE: Nvidia are definitely on the back foot....
By Fracture on 9/23/2009 3:40:57 PM , Rating: 2
Good point - I think the key factor here is the sector growth path each company has chosen.

Nvidia is clearly trying to capitalize on the growing netbook market while AMD is recapturing the "fastest card" title - a legacy crown from the emerging enthusiast market.

However, enthusiasts only make up a small percentage of actual purchases (though they have greater influence on the purchases of others). DX11 is a good feature, but primarily benefits them - so AMD DID target their product line and timing appropriately. Still, this could hurt them in the end by losing market share in the fastest growing sector (laptops/netbooks). Let's hope they keep up with a quick 1-2 combo and release a new mobile set soon.


By afkrotch on 9/25/2009 5:24:24 AM , Rating: 2
I think AMD has learned that capturing the enthusiast market helps capture more of the mainstream market. If you can claim the performance crown, the mainstream market assumes all the cards below it are also the fastest in their price range.


By carl0ski on 9/26/2009 2:20:43 AM , Rating: 2
Not true.

The notebook market wants four, count them, four display output channels on its devices: two internal LCDs and two external screens.

This new device, once ported to a Mobile edition, will support that. AMD/ATI is answering what the market wants in order to increase its market share in notebooks.

However, the marketing is focusing, as it should, on one feature, DX11, as that is what people want at this point in time.


RE: Nvidia are definitely on the back foot....
By Screwballl on 9/23/09, Rating: -1
RE: Nvidia are definitely on the back foot....
By SerafinaEva on 9/23/09, Rating: -1
By Samus on 9/23/2009 5:20:06 PM , Rating: 5
yep, they just cost $500 bucks each :)


By Belard on 9/24/2009 6:14:02 AM , Rating: 5
So how do you explain how the X1900 series of cards were constantly faster than the GF7800 / 7900s?

Yes, Nvidia came back against the 9800Pro/XTs with the 6600/6800, but then ATI came out with the X1800s.

So no, it was the 8800GTX's arrival that put the once-$500 X1900XTX down several notches. Just like the $380 ATI 5870 is going to do some serious hurt on most of the GTX line; considering what those cost, the GTX 285 has just lost value... which was expected anyway.

The 4800s were never quite as fast as certain targeted GTX cards, but they were a lot cheaper and severely hurt Nvidia's bottom line.

Remember the GTX 280 at about $600 and the GTX 260 at $400? Then 2 weeks later, ATI steals their thunder with the 4850/4870 cards at $200/$300. Overnight, the GTX 280 dropped down to $400 ($250 for the GTX 260)... but still people were buying the 4800s.

Then Nvidia spent the rest of the year re-badging 8800GT into many flavors of 9600~9800/GTS2 cards... what a mess!

Considering that the GTX2 (what happened to GTX1?) are still DX10 cards, THOSE should have been called 9800GTS/GTX and the G80 the 9600GT. It would have made more sense. Oh well.

An important note about these 5870 benchmarks: these are new drivers for new cards on a new OS, so expect them to IMPROVE like any other GPU over the course of a year.


By JPForums on 9/24/2009 8:27:49 AM , Rating: 2
quote:
Actually Nvidia has always had the performance crown when it comes to single cards ever since the 6800 Ultra.


While the 6800 Ultra did beat the X800XT in many benchmarks, it also lost in many as well. You could arguably give it the edge, but this was by no means clear dominance. I would give the edge to the 7800GTX over the X1800XT, but nVidia couldn't really beat the X1900 series until the 8800 series came out.

Of course, nVidia has had the performance crown for the vast majority of the time since then. ATi didn't really get the crown back until the 4870x2 and then only for a short while. I suppose you can shake the results up a little if you include the 7950GX2 and the 3870x2, but in my opinion, neither of these cards performed consistently or stably enough to warrant single card consideration. Even the 4870x2 and GTX295 take flak at some review sites for lack of consistency, though stability seems to be good now.


RE: Nvidia are definitely on the back foot....
By Totally on 9/23/2009 3:50:36 PM , Rating: 5
quote:
'where the gtx 295 IS STILL faster than a single 5870.'

and a 5870x2 is not faster than a gtx 295?


By MrPoletski on 9/24/2009 4:32:15 AM , Rating: 2
It almost certainly will be, when it exists.

As a 4870x2 owner it's the only card that really interests me in this lineup. The 5870 is just not fast enough to warrant swapping my 4870x2 out for. It's basically the same thing merged into one chip with DX11 features.

I don't know how true this is, but I heard the 5870x2 will be a multi-chip module, rather than two separate chips. Sideport will allow the two to talk to each other at high speed. I am hoping this means we can do away with having double-lots of memory (my 4870x2 has 2GB but it is virtually no more useful than a 4870x1's 1GB) by sharing memory access between the two chips. 5870x2 with 2GB of pooled memory that both chips can access simultaneously would just be WiN!.

Before I invest in any 58xx card though, I want to see how the GT-300 fares (the x2 will probably wait until it is about to come out anyway). Aside from anything else, new heated competition will make the ATi cards cheaper.

Meh, they will always be ATi as far as I'm concerned. Guess that makes me an old boy...


RE: Nvidia are definitely on the back foot....
By JPForums on 9/24/2009 7:51:03 AM , Rating: 1
quote:
the 4870 came out before the gtx 280 and the 4870x2 came out before the gtx 295


The GTX280/260 launched in June 2008 just days before the 4870/4850. I seem to remember a big deal about nVidia trying to crash ATi's party and ATi delaying the launch. ATi didn't get the single card crown until the very prompt release of the 4870x2 in August of 2008. The GTX295, however, wasn't released until early 2009.

quote:
amd may have had that advantage due to earlier releases


Even if the 4870 had launched before the GTX280, the short period of time between when one company updates their line up and the other follows suit is hardly enough to warrant saying the performance crown has changed hands anyways. However, the time period between the launches of the 4870x2 and the GTX295 was significant. ATi's promptness legitimately earned them the performance crown for a short period with the 4870x2.

quote:
the gtx 295 IS STILL faster than a single 5870


You are correct. While the 5870 is the fastest single gpu on the market, the GTX295 still holds the title of fastest single card. That said, the fastest multicard configuration most certainly goes to the 5870. Though, we'll have to wait for g300 from nVidia before we can justify a change of hands for the crown. This, of course, assumes nVidia will have g300 out in the next 2 months or so. As far as a 5870x2 goes, we'll just have to wait and see. It seems ATi will be dealing with the same issues that plagued the GTX295 this time (power and heat). So it may be a while in coming.


RE: Nvidia are definitely on the back foot....
By Targon on 9/24/2009 9:21:47 AM , Rating: 2
Not just out, NVIDIA needs to have their new parts available for sale. Paper launches really should not count.


By afkrotch on 9/25/2009 5:28:43 AM , Rating: 2
Tell that to ATI, not Nvidia. Nvidia hasn't done a paper launch for a while now.


RE: Nvidia are definitely on the back foot....
By The0ne on 9/23/2009 3:04:47 PM , Rating: 2
Already planning on getting one for my i7 build later this year. This card is looking good, and although I don't think DX11 adoption will be immediate, I personally think it won't be a flop like DX10 :)


By akse on 9/24/2009 2:03:16 AM , Rating: 2
This will also be my new card.. the old 8800GTS (320MB) is taking its last breaths with new games, and I'll probably sell it for 50e to a friend to give him a nice boost in graphics :)

But I'll buy the 5870 later.. maybe December, when the prices have dropped a bit.


RE: Nvidia are definitely on the back foot....
By Parhel on 9/23/2009 3:14:04 PM , Rating: 2
quote:
How long has it been since they let ATi do something like this without some kind of distraction in the press?


It's totally out of character for Nvidia. No paper launch, no announcements, no driver releases. I wouldn't be surprised if they're sitting on another 5800 Ultra.


RE: Nvidia are definitely on the back foot....
By Amiga500 on 9/23/2009 3:16:28 PM , Rating: 2
I know...

The silence says more to me than anything else today. If any of you have Nvidia shares I would advise...

D-U-M-P!


RE: Nvidia are definitely on the back foot....
By SerafinaEva on 9/23/09, Rating: -1
By Amiga500 on 9/23/2009 3:36:20 PM , Rating: 5
What's the weather like in Santa Clara these days?


By kroker on 9/23/2009 4:39:43 PM , Rating: 1
Nvidia, silent? Sorry, but that is just funny, considering all the badmouthing they did. Nvidia is the king of promising and hyping (CUDA, PhysX, "the CPU doesn't matter", "open a can of whoop-ass", etc.)

And recently, they said that their GT300 GPUs will be faster than the HD 5800 series, and that DX 11 won't drive video card sales. Isn't that promising and hyping right there?


By MrPoletski on 9/24/2009 4:38:48 AM , Rating: 2
I am skeptical that Nvidia has gone with a 'brand new architecture'. I'd be very surprised if it is anything other than just an evolution of the GT200 architecture.

They want people to think it's something brand new because that'll get people interested despite it possibly taking longer than it should to get to market.


By MrPoletski on 9/24/2009 4:35:37 AM , Rating: 2
quote:
The silence says more to me than anything else today. If any of you have Nvidia shares I would advise...

D-U-M-P!


so you can buy it all up for ten a penny? =)


RE: Nvidia are definitely on the back foot....
By kattanna on 9/23/2009 5:31:31 PM , Rating: 2
quote:
GT300 must be giving them real headaches...


I could have sworn I read somewhere within the past couple weeks, but can't find it now, that they are having horrid first-wafer yields on their new chips, and possibly might have to do another stepping to correct such horrendous yield issues, which would put it out another 2-3 months beyond the December timeframe they have been telling people.

Really wish I could find that article..


RE: Nvidia are definitely on the back foot....
By monomer on 9/23/2009 5:55:08 PM , Rating: 3
Charlie D (formerly of The Inq) reported on his site, SemiAccurate, that yields on the first batches of the GT300 were under 2%. Basically, from four 300 mm wafers they were able to produce 7 good chips out of a possible 416.

http://www.semiaccurate.com/2009/09/15/nvidia-gt30...

Kyle at HardOCP later confirmed these numbers in a forum post, and some foreign sites like Chiphell were also reporting similar news from their sources.
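A quick check of that figure using the numbers above:

    wafers = 4
    dies_per_wafer = 104      # candidate GT300 dies per 300 mm wafer (416 / 4)
    good = 7                  # reported working chips across all four wafers
    print(good / (wafers * dies_per_wafer))   # ~0.017, i.e. under 2%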


By MrPoletski on 9/24/2009 4:43:17 AM , Rating: 2
Remember the G80? Yields were in the toilet for that when they first spun it, too.

This is merely why they are not talking about releasing the GT300 now. If yields were great they would be, but I doubt they expected their first yields to be any good anyway. Perhaps not as bad as that, but they will certainly have taken into account the need to respin once or twice, maybe thrice.


By MrPoletski on 9/24/2009 4:53:18 AM , Rating: 3
Incidentally, from a 65nm 300mm wafer the GT200 got 94 chips.

Now, on a 40nm wafer, they are getting 104. So the die size is roughly 90% that of the GT200. The gate size has shrunk to roughly 61.5%, so it would stand to reason that the number of transistors has changed by (roughly) 0.90/0.615 = 1.47, so roughly 50% more transistors.

GT200 was 1.4 billion, so GT300 looks like it's just over 2 billion. Word is it's between 2.1 and 2.4 billion, so that can't be far out...
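A sketch of that back-of-the-envelope estimate, using the commenter's figures (note it scales density by the linear feature-size ratio; ideal scaling would use its square, so the first number is best read as a conservative bound):

    dies_65nm, dies_40nm = 94, 104        # GT200 vs. GT300 candidates per wafer
    area_ratio = dies_65nm / dies_40nm    # ~0.90: the new die is ~90% the size
    shrink = 40 / 65                      # ~0.615 linear feature-size ratio
    print(area_ratio / shrink * 1.4e9)    # ~2.1 billion transistors (the estimate above)
    print(area_ratio / shrink**2 * 1.4e9) # ~3.3 billion with ideal area scaling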


By kattanna on 9/24/2009 10:46:34 AM , Rating: 2
ahh.. yes. that was it.

thanks!


power consumption
By Souka on 9/23/2009 4:27:49 PM , Rating: 2
What's the power consumption and PSU requirements?

I see the idle posted above...but looking for load requirements




RE: power consumption
By achintya on 9/23/2009 4:34:21 PM , Rating: 3
RE: power consumption
By Souka on 9/23/2009 4:48:11 PM , Rating: 2
Ya...read a few of the reviews after posting this...heh.

My freebie GTX 285 isn't going anywhere yet..heh.

Big downer for me is the noise... come on ATI, open that exhaust port up...let the card breathe! Can't wait till PowerColor puts a cooler on it...they typically do nice work.


Review with COD5?
By tallcool1 on 9/24/2009 1:00:48 PM , Rating: 2
Any reviews of the 5800 Series running COD World At War?




Dissapointing Benchmarks
By SerafinaEva on 9/23/09, Rating: -1
RE: Dissapointing Benchmarks
By Amiga500 on 9/23/2009 3:15:05 PM , Rating: 5
WHAT?!?!

You're aware the former Nvidia 'flagship' GTX 280 couldn't stick with the 9800x2 when it was first launched. If you are on the Nvidia payroll, I would expect you were hoping no one would remember that.

New single GPU cards will virtually never beat a double GPU card of the previous generation unless the scaling of crossfire/SLI is rubbish. With double GPU cards being a valid option for the high end, a well designed GPU will always be designed small enough to fit 2 onto one card.

How do you think a 5870x2 will do against the GTX295?


RE: Dissapointing Benchmarks
By Parhel on 9/23/2009 3:15:46 PM , Rating: 4
quote:
Nvidia is going to come out with a chip that will easily trounce and double the performance here and retain the performance crown they've always had.


OK. When?


RE: Dissapointing Benchmarks
By SerafinaEva on 9/23/09, Rating: -1
RE: Dissapointing Benchmarks
By Totally on 9/23/2009 4:09:12 PM , Rating: 2
let's take a quick trip back to 2007

Jan - DX10 publicly available
May - 2900 launch
Sept - 8800 release
Nov - first DX10 game available

I'm pretty sure they would have wanted to have their cards out before then


RE: Dissapointing Benchmarks
By Targon on 9/24/2009 9:33:54 AM , Rating: 2
Hmmmm, 5870 is faster than any other single GPU card out there, so how can you call this a mistake? There are a LOT of people out there who would buy the 5870 just for the performance in DirectX 8-10.1 alone, DX11 support just makes it a bit more future proof.

Keep in mind that if the 5870 were released with the same level of performance as the 4890, there would be very little reason for people to run out and buy it, but with the performance improvements, it is a great time to release it.

Now, no one has hard numbers or any evidence that NVIDIA will have a better performing card except when you look at physics processing, or maybe GPGPU stuff which very few people really care about. I don't know about you, but I would NEVER buy a video card based on how well folding@home works.

Unless NVIDIA can release a video card that is not only faster, but also costs an appropriate amount of money and has equal or better video quality than AMD/ATI, I will feel that the 5870 is the card to buy. Better performance but a worse image (which can be subjective) also doesn't mean much once you hit 60fps sustained with a good "lowest framerate".

If NVIDIA comes out with a quad GPU on a single card, will they be able to sell them for the $1000 such a card would cost? What if the next NVIDIA flagship part costs $600 and is dual-GPU, and the single-GPU part only provides a 2-5 percent performance benefit over the 5870 but also lacks many of the features of the 5870(built-in HDMI being very useful)?

Yes, DX11 isn't a reason to buy a new video card now, but for the performance, I am considering if I can afford a 5870 right now...


RE: Dissapointing Benchmarks
By nafhan on 9/23/2009 3:16:54 PM , Rating: 2
Still, we're talking dual vs. single GPU. Plus, it's a smaller die than a single GTX 285/275, of which the GTX 295 has two, so it's much cheaper to produce.


RE: Dissapointing Benchmarks
By ClownPuncher on 9/23/2009 3:53:31 PM , Rating: 2
Judging by the minimum framerate edge that the 5870 has over the 295, I would say that it is very impressive.


RE: Dissapointing Benchmarks
By srp49ers on 9/23/2009 5:13:04 PM , Rating: 2
Plus, the clocks on the 5870/5850 seem to be conservative. I'm sure by the time Nvidia comes out with their cards, ATI will release an "OC" version. Asus plans to have voltage-tweaked cards as well, which apparently clock in at 1035 core / 5200 memory.


RE: Dissapointing Benchmarks
By Kaleid on 9/25/2009 5:23:06 AM , Rating: 2
There will hopefully be faster GDDR5 memory available next year for a refresh.


RE: Dissapointing Benchmarks
By rs1 on 9/23/2009 4:11:51 PM , Rating: 3
quote:
Meh, the benchmarks aren't too impressive for a next gen flagship card.


Um...aside from the questionable logic of comparing a single-GPU card to a dual-GPU card, maybe you should read Ryan's article about the 58xx cards before you go posting nonsense.

To wit, the 5870 is *not* the flagship card. AMD's DX11 flagship card will be Hemlock-based, as you can see here:

http://www.anandtech.com/video/showdoc.aspx?i=3643...

Given that it should perform like two 5870's in CrossFire, it should easily trounce anything nvidia currently has, and provide decent competition for their next-gen offering.


RE: Dissapointing Benchmarks
By Kaleid on 9/25/2009 5:20:50 AM , Rating: 3
They haven't always had the crown. 9700 pro ruled for a long time (just as Nvidia did with G80).

And to have the fastest card means pretty much nothing since there is no huge demand for them.

What matters is price/performance ratio and Ati does very well here.


RE: Dissapointing Benchmarks
By Lerianis on 9/26/2009 1:33:31 AM , Rating: 2
True.... price/performance IS a reason why I won't look at NVidia right now... I got more power per dollar by going with ATI.


"If you look at the last five years, if you look at what major innovations have occurred in computing technology, every single one of them came from AMD. Not a single innovation came from Intel." -- AMD CEO Hector Ruiz in 2007














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki