
Comparison data from ATI
ATI's roadmap shows the introduction of mainstream DX11 hardware

When ATI first launched the Radeon HD 4870 last year, it delivered a new level of performance at a great price point. NVIDIA struggled to compete, but still managed to hold onto much of its market share.

With the launch of Windows 7 and DirectX 11 on October 22, many video card buyers are already looking for DX11 cards to be ready for the latest games and to future-proof their systems. ATI is giving the public that option by launching the Radeon 5800 series today.

The Radeon HD 5870 and the Radeon HD 5850 both use the same GPU core, previously codenamed Cypress. Comprised of 2.15 billion transistors, it has almost three times as many transistors as Intel's Core i7 CPU. The Radeon 5870 runs at a core clock of 850MHz, while the Radeon 5850 runs at 725MHz.
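As a quick sanity check on that comparison, the arithmetic works out; note the Core i7 figure below (roughly 731 million transistors for the Bloomfield die) comes from Intel's published specifications, not from this article:

```python
# Cypress transistor count from the article; the Core i7 (Bloomfield)
# count of ~731 million is an outside figure used for illustration.
cypress_transistors = 2.15e9
core_i7_transistors = 731e6

ratio = cypress_transistors / core_i7_transistors
print(f"Cypress has {ratio:.1f}x the transistors of a Core i7")  # ~2.9x
```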

The Radeon 5870 has an impressive 1600 Stream processors and is capable of delivering up to 2.72 teraFLOPS. It is the most powerful single GPU video card currently available and will sell for around $379 at e-tailers. Meanwhile, the Radeon 5850 has 1440 Stream processors and will be priced around $259. Both cards are paired with 1GB of GDDR5 memory, and will use a dual slot design.
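The quoted 2.72 teraFLOPS follows directly from the specs: each stream processor can retire a fused multiply-add (two floating-point operations) per cycle. A minimal sketch of that arithmetic:

```python
def peak_tflops(stream_processors: int, clock_ghz: float) -> float:
    """Theoretical single-precision peak, assuming one fused
    multiply-add (2 FLOPs) per stream processor per cycle."""
    flops_per_cycle = 2
    return stream_processors * clock_ghz * flops_per_cycle / 1000.0

print(peak_tflops(1600, 0.850))  # Radeon HD 5870 -> 2.72 TFLOPS
print(peak_tflops(1440, 0.725))  # Radeon HD 5850 -> ~2.09 TFLOPS
```

The same formula gives the 5850's theoretical peak of about 2.09 teraFLOPS at its lower clock and shader count.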

One of the more interesting things ATI is doing is called Eyefinity, which allows up to three monitors to be driven from a single video card. All cards in the 5800 series will have two dual-link DVI ports, an HDMI port, and a DisplayPort. Since a TMDS transmitter is needed for each DVI or HDMI port, a DisplayPort monitor or adapter is required to use the third display. Eyefinity can support three displays at resolutions of up to 2560x1600 each, and future models will support up to six monitors.
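Three such panels in an Eyefinity group appear to the OS as one large surface; the combined resolution (ignoring bezel compensation, and assuming a side-by-side layout purely for illustration) works out to:

```python
# Illustrative only: three 2560x1600 panels arranged side by side.
panels, width, height = 3, 2560, 1600

group_width = panels * width
pixels = group_width * height
print(f"{group_width}x{height} = {pixels / 1e6:.1f} megapixels")
```

That is a 7680x1600 desktop, roughly 12.3 megapixels for the GPU to fill every frame.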

Power efficiency is something the computer industry as a whole has been moving towards. The Radeon 5870 has an idle board power of 27 watts, compared to 90 watts for the Radeon HD 4870. This is due to better power circuitry and a much deeper reduction in clock speed at idle. Unlike newer CPUs, the GPU cannot shut down idle execution units, but that capability is inevitable in future GPUs.
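The 63-watt gap at idle adds up over time. A back-of-the-envelope figure, assuming (purely for illustration) a machine left idling around the clock:

```python
idle_hd4870_w = 90   # idle board power figures from the article, in watts
idle_hd5870_w = 27

saved_w = idle_hd4870_w - idle_hd5870_w
kwh_per_year = saved_w * 24 * 365 / 1000  # watt-hours per year -> kWh
print(f"{saved_w} W lower idle draw, ~{kwh_per_year:.0f} kWh/year if idle 24/7")
```

Real usage patterns vary, of course, but even a few hours of idle per day compounds into a meaningful difference on an electric bill.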

ATI first adopted TSMC's 40nm process with the RV740 core used in the Radeon HD 4770. With the second-generation process from TSMC, ATI's yields on the 40nm node are quite high. DailyTech has been told that the launch is "rock hard" and there will be products available today.

As the first DX11 parts, the new cards face huge pent-up demand that will take several weeks to dissipate. Availability is therefore expected to be tight for the next two weeks. Demand is expected to rise again around the October 22 launch of Windows 7.

Most video card manufacturers are expected to use the reference design to save costs, but some are already planning modifications to the cooler in order to enable better overclocking or less noise.

ATI plans to ship more DirectX 11 parts in the near future. Hemlock combines two Cypress cores into a Radeon HD 5870 X2 configuration, while Juniper will be this generation's equivalent of the Radeon 4830, selling at the $199 price point.

DirectX 11 hits the mainstream early next year with two mainstream products, likely to be known as the Radeon 5600 series and comprising the Radeon HD 5670 and Radeon HD 5650. Ultimately, the technology seen in the Radeon 5800 will move into mobile computers and integrated chipsets over the next year.

Comments

RE: Nvidia are definitely on the back foot....
By Cypherdude1 on 9/24/2009 11:29:44 AM , Rating: 3
I've been doing some work at a place where someone bought some really cheap Dell Inspirons in the Fall of 2006. They are running what must be the last LCDs made with only an analog connection.

Why should you care what type of connection the monitor uses? If you're looking at the connection, you're looking at the wrong side of the monitor. You're supposed to look at the face of the monitor, not the back. Does it really matter whether it's connected via Analog RGB, DVI, or DisplayPort? Personally, as long as the picture is sharp, the refresh rate is high enough, and I can do the work, it doesn't make any difference how it's connected.

The same goes for the DirectX editions. Are there really any noticeable performance and quality differences between 9, 10, and 10.1? Side by side, would anyone be able to tell the difference?

By ChuckDriver on 9/24/2009 4:27:36 PM , Rating: 3
"Why should you care what type of connection the monitor uses?"

I still think you make a good point. Being an enthusiast and making a little money from providing support, however, I do spend time looking at the back of the monitor. Generally, LCDs do a decent job with an analog signal, but there are annoying quirks. Booting between Windows and Linux, the edges of the desktop are cut off when using VGA because of timing differences in the modeline each OS uses, despite both being set to the same resolution. Fixing it is not difficult, just going two levels into the OSD to perform an auto-adjust, but it is annoying and is not a problem with DVI. Also, for me, text appears sharper with DVI.

RE: Nvidia are definitely on the back foot....
By afkrotch on 9/25/2009 5:09:55 AM , Rating: 2
It depends what you are going to be using the connection with.

- Can't be rocking a 30" monitor with an analog connection.
- No HD without DVI, HDMI, or DisplayPort. Not that anything even used HDCP yet.
- HDMI and DisplayPort carry audio, so you can have a single cable connection to your monitor. If you plan to use the speakers in the monitor.
- DisplayPort doesn't have a licensing fee, so in theory, monitor prices would be lower.

I don't have DisplayPort, so I can't really comment much on how it actually is, but I hate the hell out of HDMI. I connect my 360 up to my 26" Asus monitor via HDMI. I adjust my monitor's forward/back lean a lot and this minor movement is enough to cause a loss of signal. The HDMI cable doesn't fall out or become disconnected; it's just that the pins no longer touch.

These new cables are nice and small, but they annoy me. SATA does the same kind of crap. They need to work on better locking mechanisms for these things.

By Lazarus Dark on 9/27/2009 12:15:29 AM , Rating: 2
"No HD without DVI, HDMI, or DisplayPort. Not that anything even used HDCP yet."
Blu-ray on a PC requires HDCP; it won't play if you use VGA, unless you're using AnyDVD.
I think Blu-ray is probably the only use HDCP will ever have.


Copyright 2016 DailyTech LLC.