

Comparison data from ATI
ATI's roadmap shows the introduction of mainstream DX11 hardware

When ATI first launched the Radeon HD 4870 last year, it delivered a new level of performance at a great price point. NVIDIA struggled to compete, but still managed to hold onto much of its market share.

With the launch of Windows 7 and DirectX 11 on October 22, many video card buyers are already looking for DX11 cards in order to be ready for the latest games and to future-proof their systems. ATI is giving the public that option by launching the Radeon HD 5800 series today.

The Radeon HD 5870 and the Radeon HD 5850 both use the same GPU core, previously codenamed Cypress. Composed of 2.15 billion transistors, the chip packs almost three times as many as Intel's Core i7 CPU. The Radeon 5870 runs at a core clock of 850MHz, while the Radeon 5850 runs at 725MHz.

The Radeon 5870 has an impressive 1600 Stream processors and is capable of delivering up to 2.72 teraFLOPS. It is the most powerful single-GPU video card currently available and will sell for around $379 at e-tailers. Meanwhile, the Radeon 5850 has 1440 Stream processors and will be priced around $259. Both cards are paired with 1GB of GDDR5 memory and use a dual-slot design.
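
A quick back-of-the-envelope check of that figure: each stream processor can retire one multiply-add (two floating-point operations) per clock cycle, so peak throughput follows directly from shader count and clock. The Python sketch below uses a function name of our own, purely for illustration:

```python
# Peak single-precision throughput: each stream processor retires one
# multiply-add (2 floating-point operations) per clock cycle.
def peak_tflops(stream_processors, core_clock_mhz, flops_per_clock=2):
    return stream_processors * flops_per_clock * core_clock_mhz * 1e6 / 1e12

print(peak_tflops(1600, 850))  # Radeon HD 5870 -> 2.72
print(peak_tflops(1440, 725))  # Radeon HD 5850 -> 2.088
```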

One of the more interesting things that ATI is doing is called Eyefinity, which allows up to three monitors to be driven from a single video card. All cards in the 5800 series have two dual-link DVI ports, an HDMI port, and a DisplayPort. Because the GPU provides only two TMDS clock sources to share among its DVI and HDMI outputs, the third monitor must be connected through DisplayPort, either natively or with an adapter. Eyefinity is capable of supporting three displays at resolutions up to 2560x1600, and future models will support up to six monitors.
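
A minimal sketch of that constraint in Python, assuming two shared TMDS clock sources; the port names and helper function are hypothetical, not actual driver logic:

```python
# Illustrative sketch, not actual driver logic: assume the GPU has only
# two TMDS clock sources shared by its DVI and HDMI outputs, while a
# DisplayPort output drives its own link clock.
TMDS_PORTS = {"dvi1", "dvi2", "hdmi"}   # hypothetical port names
MAX_TMDS_CLOCKS = 2

def eyefinity_config_ok(active_ports):
    """Return True if the set of active outputs fits the TMDS clock budget."""
    tmds_in_use = len(set(active_ports) & TMDS_PORTS)
    return tmds_in_use <= MAX_TMDS_CLOCKS

print(eyefinity_config_ok({"dvi1", "dvi2", "displayport"}))  # True
print(eyefinity_config_ok({"dvi1", "dvi2", "hdmi"}))         # False
```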

Power efficiency is something that the computer industry as a whole has been moving towards. The Radeon 5870 has an idle board power of 27 watts, compared to 90 watts for the Radeon HD 4870. This is due to better power circuitry and a much deeper reduction in clock speed at idle. Unlike newer CPUs, it does not have the ability to shut down idle processing units, but that capability seems inevitable in future GPUs.

ATI first used TSMC's 40nm process with the RV740 core in the Radeon HD 4770. With the second-generation 40nm process from TSMC, ATI's yields on the node are now quite high. DailyTech has been told that the launch is "rock hard" and that products will be available today.

As the first DX11 parts on the market, the new cards face a huge pent-up demand that will take several weeks to dissipate. Availability is therefore expected to be tight for the next two weeks. Demand is expected to rise again around the October 22 launch of Windows 7.

Most video card manufacturers are expected to use the reference design to save costs, but some are already planning modifications to the cooler to enable better overclocking or quieter operation.

ATI has plans to ship more DirectX 11 parts in the near future. Hemlock combines two Cypress cores into a Radeon HD 5870 X2 configuration, while Juniper will fill the role the Radeon 4830 played in the previous generation, selling at the $199 price point.

DirectX 11 hits the mainstream early next year with two more affordable products, likely to be known as the Radeon 5600 series and comprising the Radeon HD 5670 and Radeon HD 5650. Ultimately, the technology seen in the Radeon 5800 will move into mobile computers and integrated chipsets over the next year.



Comments



RE: Nvidia are definitely on the back foot....
By soundgy on 9/24/2009 7:55:30 AM , Rating: 2
Yes. They already have a few of these adapters out now, just like the adapters used on the newest MacBook Pros, which have Mini DisplayPort out instead of DVI.


RE: Nvidia are definitely on the back foot....
By Hieyeck on 9/24/2009 8:33:06 AM , Rating: 2
What is this new DisplayPort (kidding)?! I just got done removing our final VGA monitor at work yesterday (not kidding)!

Seriously, didn't DVI FINALLY take hold just a few years back? IIRC DisplayPort and DVI have similar bandwidth too, so why the shift? And how would adapters work? Aren't they electrically incompatible?


RE: Nvidia are definitely on the back foot....
By soundgy on 9/24/2009 9:22:43 AM , Rating: 2
Unfortunately I do not know how they work (Fairy Dust!). But I do know that DisplayPort and Mini DisplayPort can carry digital audio where DVI can't. And DisplayPort has the mini connector, which is awesome for next-generation graphics cards: they can have 6 ports lined up, yet only take up a single/double slot. Anyone else care to add?


By MikeMurphy on 9/24/2009 4:12:16 PM , Rating: 5
DisplayPort is packet based while DVI/HDMI are not.

It will allow all sorts of data to be transferred over the same cable that handles your video. Examples include audio, USB, and eSATA connectivity from your monitor without the need for any additional cables.

It's arguably what HDMI should have been.
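
A loose Python illustration of the packet idea (this is not the real DisplayPort framing, just a sketch of how tagged packets let unrelated streams share one cable):

```python
from dataclasses import dataclass

# Loose illustration only -- not the actual DisplayPort protocol.  The
# point is that a packet-based link tags each payload with its stream
# type, so video, audio, and sideband data can share one cable.
@dataclass
class Packet:
    stream: str     # e.g. "video", "audio", "usb"
    payload: bytes

link = [
    Packet("video", b"\x00" * 16),  # a slice of the active frame
    Packet("audio", b"\x01" * 4),   # interleaved audio samples
    Packet("usb",   b"\x02" * 8),   # sideband data for monitor peripherals
]

for pkt in link:
    print(pkt.stream, len(pkt.payload), "bytes")
```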


RE: Nvidia are definitely on the back foot....
By ChuckDriver on 9/24/2009 9:42:11 AM , Rating: 2
quote:
I just got done removing our final VGA monitor at work yesterday (Not kidding)!


I've been doing some work at a place where someone bought some really cheap Dell Inspirons in the Fall of 2006. They are running what must be the last LCDs made with only an analog connection. This is frustrating when $30 more would have gotten the same monitor plus a DVI input. Unfortunately for them, they'll be dealing with these things well into the next decade.


RE: Nvidia are definitely on the back foot....
By Cypherdude1 on 9/24/2009 11:29:44 AM , Rating: 3
quote:
I've been doing some work at a place where someone bought some really cheap Dell Inspirons in the Fall of 2006. They are running what must be the last LCDs made with only an analog connection.

Why should you care what type of connection the monitor uses? If you're looking at the connection, you're looking at the wrong side of the monitor. You're supposed to look at the face of the monitor, not the back. Does it really matter whether it's connected via Analog RGB, DVI, or DisplayPort? Personally, as long as the picture is sharp, the refresh rate is high enough, and I can do the work, it doesn't make any difference how it's connected.

The same goes for the DirectX editions. Are there really any noticeable performance and quality differences between 9, 10, and 10.1? Side by side, would anyone be able to tell the difference?


By ChuckDriver on 9/24/2009 4:27:36 PM , Rating: 3
quote:
Why should you care what type of connection the monitor uses?


http://en.wikipedia.org/wiki/HDCP

I still think that you make a good point. Being an enthusiast and making a little money from providing support, however, I do spend time looking at the back of the monitor. Generally, LCDs do a decent job with an analog signal, but there are annoying quirks. Booting between Windows and Linux, the edges of the desktop are cut off when using VGA because there are timing differences in the modeline each OS uses, despite being set to the same resolution. Fixing it is not difficult, just going two levels into the OSD to perform an auto-adjust, but it is annoying and is not a problem with DVI. Also, for me, text appears sharper with DVI.
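
To make the timing point concrete, here is a hypothetical comparison in Python; the numbers are illustrative, in the style of VESA-type modelines, not taken from any particular driver:

```python
# Hypothetical numbers, illustrative only: the same visible resolution
# can be driven with different pixel clocks and blanking totals, so an
# analog monitor that auto-adjusted to one set of timings will
# mis-center the other until re-adjusted.
windows_mode = {"pixel_clock_mhz": 108.0, "h_total": 1688, "v_total": 1066}
linux_mode   = {"pixel_clock_mhz": 109.0, "h_total": 1712, "v_total": 1063}

for key in windows_mode:
    if windows_mode[key] != linux_mode[key]:
        print(f"{key}: {windows_mode[key]} (Windows) vs {linux_mode[key]} (Linux)")
```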


RE: Nvidia are definitely on the back foot....
By afkrotch on 9/25/2009 5:09:55 AM , Rating: 2
It depends on what you are going to be using the connection for.

- Can't be rocking a 30" monitor with an analog connection (see the quick calculation below).
- No HD without DVI, HDMI, or DisplayPort. Not that anything even uses HDCP yet.
- HDMI and DisplayPort carry audio, so you can have a single-cable connection to your monitor, if you plan to use the speakers in the monitor.
- DisplayPort doesn't have a licensing fee, so in theory, monitor prices would be lower.
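
A quick calculation behind the first point, assuming reduced-blanking (CVT-RB) timings of roughly 2720x1646 for a 2560x1600 panel:

```python
# Back-of-the-envelope check on the 30" monitor claim: 2560x1600 at 60 Hz
# needs a pixel clock far above single-link DVI's 165 MHz ceiling (and far
# beyond what VGA handles cleanly), which is why 30" panels require
# dual-link DVI or DisplayPort.
h_total, v_total, refresh_hz = 2720, 1646, 60
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
print(f"{pixel_clock_mhz:.0f} MHz needed vs. 165 MHz single-link DVI limit")
# -> roughly 269 MHz, well beyond a single TMDS link
```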

I don't have DisplayPort, so I can't really comment much on how it actually is, but I hate the hell out of HDMI. I connect my 360 to my 26" Asus monitor via HDMI. I adjust my monitor's forward/back tilt a lot, and this minor movement is enough to cause a loss of signal. The HDMI cable doesn't fall out or become disconnected; it's just that the pins no longer touch.

These new cables are nice and small, but they annoy me. SATA does the same kind of crap. They need to work on better locking mechanisms for these things.


By Lazarus Dark on 9/27/2009 12:15:29 AM , Rating: 2
"No HD without DVI, HDMI, or DisplayPort. Not that anything even used HDCP yet."
Bluray on a PC requires HDCP, it won't play if you use VGA, unless you're using AnyDVD.
I think Bluray is probably the only use HDCP will ever have.


"You can bet that Sony built a long-term business plan about being successful in Japan and that business plan is crumbling." -- Peter Moore, 24 hours before his Microsoft resignation














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki