


Improved yield rates lead to new products

No manufacturing process is perfect; there will always be variation from the mean. For semiconductors, that means some chips will run much faster than the design target, while others just don't perform up to specification. ATI is launching two new graphics cards that use the same chip, but sit at opposite ends of that spectrum.

Many gamers will remember the Radeon HD 4830 as a card that brought a lot of features to the table at a lower price point than its more powerful siblings. It was eventually eclipsed by the Radeon HD 4770 and phased out, but not before making a big impact on the market.

AMD's graphics division is hoping to do the same thing with the ATI Radeon HD 5830 graphics card launching today. However, production volume won't hit the channel until next week, which was the original launch target. The Radeon 5830 uses the same Cypress chip as the rest of the 5800 series, but has fewer SIMD and ROP units than its brethren. To compensate, the 5830 is clocked 75MHz higher than the 5850. Test cards have been able to hit 900MHz core speeds and a 1300MHz memory clock using ATI's Overdrive overclocking technology found in the Catalyst drivers.
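For a rough sense of what the disabled units cost in raw throughput, peak rates can be computed straight from the unit counts and core clock. A minimal sketch (standard peak-rate arithmetic, not measured performance; it assumes each Cypress stream processor retires two FLOPs per clock via a multiply-add):

```python
# Back-of-the-envelope peak throughput from unit counts and core clock.
# Assumes 2 FLOPs (one multiply-add) per stream processor per clock.
def peak_rates(shaders, tmus, rops, core_mhz):
    ghz = core_mhz / 1000.0
    return {
        "shader throughput (TFLOPS)": shaders * 2 * ghz / 1000.0,
        "texture fill rate (Gtexels/s)": tmus * ghz,
        "pixel fill rate (Gpixels/s)": rops * ghz,
    }

print(peak_rates(1120, 56, 16, 800))  # HD 5830: ~1.79 TFLOPS, 44.8 GT/s, 12.8 GP/s
print(peak_rates(1440, 72, 32, 725))  # HD 5850: ~2.09 TFLOPS, 52.2 GT/s, 23.2 GP/s
```

The higher clock claws back some shader and texture throughput, but with half the ROPs the 5830's peak pixel fill rate sits well below the 5850's.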

The performance of the Radeon 5830 is most similar to that of the Radeon HD 4890, which launched in April of last year. The 4890 currently fills a gap between the Radeon 5770 and the Radeon 5850, both in performance and pricing. However, it is built using an older 55nm process and is more expensive to keep in production than the 40nm Cypress chips, which have seen marked improvements in yield rates at TSMC.

Power consumption and fan noise are also much lower with the 5830. The 5830 has a Thermal Design Power of 175 watts at load and 25 watts at idle, compared to the 4890's 190 watts at load and 60 watts at idle. The launch is being handled primarily by ATI's Add-In-Board partners like ASUS, Sapphire, and Gigabyte. Each company will use its own boards and cooling solutions, which will lead to greater price variation than usual. There will also be some 5830 cards bundled with a Call of Duty: Modern Warfare 2 Bonus Pack. The Radeon 5830 is expected to sell for around $239.
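The idle-power gap is the kind of difference that shows up on a power bill. A quick illustration (the daily hours and electricity rate are assumptions for the example, not figures from ATI):

```python
# Annual idle energy cost: HD 4890 (60W idle) vs. HD 5830 (25W idle).
# Usage pattern and electricity price are assumed for illustration.
IDLE_HOURS_PER_DAY = 8
USD_PER_KWH = 0.12

def annual_idle_cost(idle_watts):
    kwh_per_year = idle_watts / 1000.0 * IDLE_HOURS_PER_DAY * 365
    return kwh_per_year * USD_PER_KWH

print(f"HD 4890: ${annual_idle_cost(60):.2f} per year")  # ~$21.02
print(f"HD 5830: ${annual_idle_cost(25):.2f} per year")  # ~$8.76
```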

“The ATI Radeon HD 5830 graphics card makes enthusiast-level performance even more accessible to gamers, adding another compelling choice to the award-winning ATI Radeon HD 5800 series,” said Matt Skynner, Vice President and General Manager of the AMD Graphics Division. “Cutting-edge features such as full DirectX 11 support, ATI Eyefinity multi-display capabilities and ATI Stream technology position the ATI Radeon HD 5830 graphics card to become a favorite with the gaming community.” 

The 5830 can support up to six monitors, but that feature is unlikely to be implemented by board partners. That role will fall to the Radeon HD 5870 Eyefinity-6 Edition, which will feature six Mini DisplayPort connectors and 2GB of GDDR5 memory. The company will include two Mini DisplayPort-to-DisplayPort adapters, two passive Mini DisplayPort-to-DVI (Single Link) dongles, and a passive Mini DisplayPort-to-HDMI dongle.

The additional memory increases the TDP to 228W, but may also boost performance in single- and dual-monitor configurations. The card will need both an 8-pin and a 6-pin power connector. The ATI Radeon HD 5870 Eyefinity-6 Edition will launch on March 11.
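To put six displays in perspective, a 3x2 wall of 1080p panels presents games with a single render target of 5760x2160, six times the pixels of one screen. Simple arithmetic, ignoring bezel compensation:

```python
# Combined render target for an Eyefinity display group (bezels ignored).
def group_resolution(cols, rows, panel_w=1920, panel_h=1080):
    width, height = cols * panel_w, rows * panel_h
    return width, height, width * height

w, h, pixels = group_resolution(3, 2)
print(f"3x2 group: {w}x{h} = {pixels / 1e6:.1f} Mpixels")  # 5760x2160 = 12.4 Mpixels
print(f"Single panel: {1920 * 1080 / 1e6:.1f} Mpixels")    # 2.1 Mpixels
```

That sixfold pixel load is a big part of why the Eyefinity-6 Edition gets the larger frame buffer.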



 

ATI Radeon HD        5870 E6    5870       5850       5830       5770       4890
Stream Processors    1600       1600       1440       1120       800        800
Texture Units        80         80         72         56         40         40
ROPs                 32         32         32         16         16         16
Core Clock           850MHz     850MHz     725MHz     800MHz     850MHz     850MHz
Memory Clock         1.2GHz     1.2GHz     1GHz       1GHz       1.2GHz     975MHz
Memory Data Rate     4.8GHz     4.8GHz     4GHz       4GHz       4.8GHz     3.9GHz
Memory Type          GDDR5      GDDR5      GDDR5      GDDR5      GDDR5      GDDR5
Memory Bus Width     256-bit    256-bit    256-bit    256-bit    128-bit    256-bit
Frame Buffer         2GB        1GB        1GB        1GB        1GB        1GB
Transistor Count     2.15B      2.15B      2.15B      2.15B      1.04B      959M
TDP                  228W       188W       151W       175W       108W       190W
Price Point          ???        $399       $299       $239       $159       $199



Hmmmmm
By Chapbass on 2/25/2010 10:12:08 AM , Rating: 4
Thoughts on 6-monitor support? I thought that, at least for gaming, 3 monitors was nice because the center of the screen (typically your character in a 3rd-person game like WoW, or your crosshair in an FPS) was actually a full monitor, and the other 2 screens were primarily for the edges of your view.

Gaming with dual monitors caused that problem, since the bezels were right in the middle. With 6 monitors, I can't think of a configuration where the center wouldn't be some sort of jumble. I could see 5 monitors all horizontal for an extreme wide view, assuming that you could trick the game into displaying it right... Anyone else's thoughts?




RE: Hmmmmm
By Chapbass on 2/25/2010 10:13:39 AM , Rating: 1
SWEET i love terrible internet at work. gogo double post.


RE: Hmmmmm
By Scabies on 2/25/2010 11:18:34 AM , Rating: 2
In the case of WoW, I think the only usable solution would be to have screens 4-6 dedicated to Map, Objectives/Achievements, and perhaps either raid frames or all of your hotbars. Putting them below the main display means having an ultra-sore neck; putting them above is like having your start menu drop from the top.

I think it is most important that the capability is there, and developers can come up with something interesting. Also, with proper armatron-izing, perhaps having a panorama of five screens with an un-eyefinitied sixth for something on your desktop is best (wowhead, vent, media player, etc).

(and I think I would go portrait mode on all five in that example, to get some extra height and save on the left-right area demands. would still be wonky.)


RE: Hmmmmm
By Targon on 2/25/2010 12:12:27 PM , Rating: 1
The Eyefinity edition was tested several months ago. The layout for a six-way setup is normally two monitors high and three across. While I agree that the lines between monitors will be a distraction for some people, it will be acceptable to others.

Now, you can also expect that with the Eyefinity-6 setup, we will see a number of monitor vendors releasing special displays with minimal bezels to limit the space between monitors.

One other solution would be to take six panels with their own inputs and come up with a custom way to tie them together. This would allow for virtually no space between the six panels if done well. I would HOPE that some companies may even make a special Eyefinity display that would do just this. It would be a single monitor with six panels and six DisplayPort inputs to provide a series of 5760x2160 monitors of various sizes. As it stands now, 1920x1080 is fairly common, but a monitor with 1200 vertical lines isn't all that easy to find at a low price point.


RE: Hmmmmm
By Mitch101 on 2/26/2010 9:16:11 AM , Rating: 3
I play with triple monitors in Eyefinity.

3 x 21.5" displays, each at 1920x1080, for a combined 5760x1080 on a Radeon 5770. Typically Left 4 Dead and WoW. WoW requires a bit of work to get all the popup display stuff right, but with other games you can get right into the action.

I don't find the bezel width to be an issue. I overlap my bezels to keep it to a minimum, about 1/2". It's not like driving your car, where the pillar between your windows blocks something from your view. There is no screen information behind the bezel portions, because the screen continues on the next monitor. Sure, it would be ideal not to have a bezel, but it's not blocking anything from your view.

What I do find is the side screens don't get depth of view correct. So there will be items on the side screen that appear much closer than they really are. In Left 4 Dead you will see something in the side screen and react much sooner than you need to, because when you put it on the center screen it's actually farther away. Don't let this deter you from getting triple screens; it's a very different world in Eyefinity and much, much more fun personally. Supposedly ATI will be addressing some of this in the 10.3 Catalyst drivers, but even if they didn't, I love Eyefinity.

Eyefinity is the funnest upgrade I have done to my PC in a long time.


RE: Hmmmmm
By Aloonatic on 2/26/2010 2:47:11 AM , Rating: 2
6 monitors does seem crazy and wholly unnecessary for gaming. It's not like PC gaming isn't expensive enough. What detail settings can you run games at with that resolution (is that the right word for multiple screens?), too? Might you not be better off just buying a good large monitor, or even a good HDTV? Sure, the resolution on the TV would be 1920x1080, but would it be that much worse than what you would get with 6 monitors, taking the gaps caused by the bezels into account?

Anyway, isn't the Eyefinity thing aimed more at traders and bankers (etc.) who need 3 or 4 screens to see how much of our money they are losing quickly and easily, whilst also keeping an eye on their bank account as it grows, and another screen for the local Ferrari garage to see what to blow their bonus on, given to them for not making a profit again, even when their government has printed money especially for them?

Also, if you're going to spend a lot of money on 6 screens, you're probably not going to be buying a middle-of-the-range card like this?


RE: Hmmmmm
By Mitch101 on 2/26/2010 9:56:55 AM , Rating: 2
Size vs. # of Screens

I tried going from multiple monitors back to a single large display with high resolution. There are a few setbacks I ran into.

1 - Servers. I also have a server; with more than one display I can put the server up on the second display and watch it for popups/patch progress, because as soon as you switch away you get a popup asking whether you want to overwrite this file.

2 - Development. It's a nice option having the second screen on a web page or the applications you may be developing on. Generally in code development you can split the screen between code and work windows, but it wants to split this horizontally, so you wind up with 540 pixels for display and 540 for code. It's OK and not fully solved with dual screens, but it helps in some areas, like where I can have a database open on the second screen while coding in the main screen.

3 - All work and no play. With a second screen I can put a football game on without sucking up space on my main screen. It allows me to work and play, so I don't feel like I'm working all the time.

4 - Training. I like computer-based training material. Having the training material running on a second screen while working/following along on the main screen works very well for me, much more than watching a CBT then switching and trying to do it myself. It's like someone mentoring you over your shoulder.

5 - Chats. I'm a center of information in our IT department and get pinged constantly through chat/IM sessions. It's nice being able to put my IM/chats on a separate screen so they don't interfere with what I am doing.

6 - Installs. Another nice perk of a separate screen is for installs. If I'm installing something on another screen, I don't accidentally close or click on something when a popup occurs. This is rare, but I thought I would throw it in because it does happen.

7 - Boundaries. A single large screen, unless you use Windows' Snap feature, doesn't have zones. When you maximize on a large display, it takes up the whole display. Using Snap on multiple monitors is like having 4 zones with two displays and 6 zones with three displays. Very handy in development.

Seriously, I tried going to a single monitor to cut down on clutter, but overall I'm much more productive and relaxed having multiple screens. Sure, there are plenty of times when multiple screens are clutter and not necessary, and I do turn off my second/third display, but for me multiple monitors work extremely well and are really necessary for the way I work.


RE: Hmmmmm
By Aloonatic on 2/26/2010 10:13:11 AM , Rating: 2
For productivity reasons, more than 1 screen is a pretty obvious win. I have been doing it for a while myself.

I was referring to using more than 1 screen in a gaming environment, where all the screens were essentially acting as 1 big screen, with a potentially massive resolution.


RE: Hmmmmm
By Mitch101 on 2/26/2010 10:39:14 AM , Rating: 2
ATI is working on an improved driver that switches between group and multi-monitor mode. Right now it's a little inconvenient to switch between modes; I need to reboot, which isn't too bad.

Right now you create a group so it acts like one giant screen; it's funny when you maximize something and it spans your three monitors. Good for finding PR0N :) but not good for casual use or development, because it really needs boundaries.

Seriously though, I love it. It has a few items to iron out that ATI is working on, but I highly recommend it.

I rank it up there with the first time I got a 3dfx Voodoo card. It seriously changes your gaming.


RE: Hmmmmm
By Mitch101 on 2/26/2010 10:22:07 AM , Rating: 3
Sorry, I didn't answer your thoughts on price.

I picked up my 21.5" monitors from Staples for $102.00 each using a $25.00-off coupon code bought on eBay for 99 cents. I could have used my existing monitor for the center, saving me the price of a third monitor, but I decided my wife would like my previous screen. Around Christmas time some people got these monitors for $84.00 each.

$102.00 x 3 = $306.00 (about the price of a cheap 32", maybe?)

I use a Radeon 5770 I got for $150.00, and in a lot of games it will actually play at 5760x1080 smoothly.

Total: $456.00, or had I kept my previous monitor, $354.00.

Not exactly cheap depending on your income, but definitely worth the fun factor and productivity.


RE: Hmmmmm
By Aloonatic on 2/26/2010 10:33:08 AM , Rating: 2
That's pretty cool.

It's not something that I've ever dabbled in, outside of the office to be honest. I'm not much of a PC gamer these days, but at that price, it's not bad value, as long as your PC/Graphics card and the games you play support it of course.

I'm guessing that multiple screens would be good for even the WoW crowd, as well as flight-sim types too? Probably more to it than I initially thought.


RE: Hmmmmm
By Mitch101 on 2/26/2010 10:59:19 AM , Rating: 2
WoW you have to customize a bit to get right, because your map goes in the corner of the right screen without tweaks, and if you have the show-character-health option on it's too big at first, but after the tweaks it's awesome.

You really have to try this. The first few minutes you're not used to it, but you quickly see the benefits, and seeing something running up on you from the side adds a level of fun. A single monitor kind of feels like having blinders on now. I haven't been able to stomach a first-person shooter in ages, but this just brought me back to life.

The worst-case scenario in games that require more GPU is that I would have to play them on a single screen. Hardly something I would complain about, but today's video cards are so powerful that it's coming down to higher-end cards just adding more eye candy and one higher resolution. I wouldn't hesitate to run triple screens at 1280x800 x 3 (3840x800) instead of a single 1920x1080.

I would say buy it for the multi-monitor productivity (CBTs, being able to watch TV while working, development) and enjoy the bonus of widescreen gaming.
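The trade-off in that last point is plain pixel arithmetic; here is a quick sketch of the configurations the comment mentions:

```python
# Pixels rendered per frame for the configurations discussed above.
configs = {
    "3 x 1920x1080 (5760x1080)": 5760 * 1080,
    "3 x 1280x800 (3840x800)": 3840 * 800,
    "single 1920x1080": 1920 * 1080,
}
for name, pixels in configs.items():
    print(f"{name}: {pixels / 1e6:.2f} Mpixels")
# 6.22, 3.07, and 2.07 Mpixels: triple 1280x800 is only ~1.5x the load of a single 1080p screen.
```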


RE: Hmmmmm
By Visual on 2/26/2010 5:29:42 AM , Rating: 2
You are right that any setup with a bezel in the middle will be bad for a single game, but there are other options.
You can have a great surround experience with 1 row of 5 displays, plus 1 separate display for hud or widgets or something.
You can also have two groups of 3 monitors, and play two instances of some game.
Or even just play 6 instances without the need to alt-tab. LOL, I know it may sound absurd, but with games like EVE Online this is actually feasible. I can even imagine each of the 6 displays in portrait mode, divided in the middle, fitting a total of 12 EVE Online instances very comfortably.

Besides, I think this card would receive much more attention from artists, designers, engineers, and other people who can make use of 6 displays for work rather than for games.


Not a Paper Launch, Ryan Smith (from Anandtech)
By tallcool1 on 2/25/2010 1:17:16 PM , Rating: 4
http://www.anandtech.com/video/showdoc.aspx?i=3750...
Ryan Smith from Anandtech wrote this in his review:
quote:
This brings up the other elephant in the room: today’s paper launch. Paper launches should by all means have died last year, but their ghost apparently continues to live on. If in fact no 5830s make it to retailers in time for today’s launch, then the card should not have been launched today – it’s as simple as that.
Whining about a paper launch when, at worst, the cards were to be available in less than a week...
HOWEVER, the cards are available NOW, one day after his review.

http://www.newegg.com/Product/Product.aspx?Item=N8...
http://www.newegg.com/Product/Product.aspx?Item=N8...

Shown in stock and available.

Will he have the guts to make a retraction?




RE: Not a Paper Launch, Ryan Smith (from Anandtech)
By Parhel on 2/25/2010 2:49:59 PM , Rating: 2
I doubt it. It's funny, I just posted a comment to the same effect on that article. I think their video card reviews have gone downhill since Anand and Derek were writing them. Nowadays, I only really care what HardOCP has to say...


RE: Not a Paper Launch, Ryan Smith (from Anandtech)
By Parhel on 2/25/2010 4:00:40 PM , Rating: 5
He updated the article, and even responded to my post. I think I was overly harsh, but I guess I hold Anandtech to a higher standard than most tech sites.


RE: Not a Paper Launch, Ryan Smith (from Anandtech)
By GTVic on 2/26/2010 4:25:27 AM , Rating: 2
Well, the guy deserved it I think. He went overboard with his criticisms on price and availability.


RE: Not a Paper Launch, Ryan Smith (from Anandtech)
By kroker on 2/26/2010 5:06:14 AM , Rating: 2
I'm not sure why they were misinformed about availability, but why do you think they went overboard with the criticism about price? I think they were completely right. HD5830 has both disappointing specs and disappointing performance. Maybe future drivers will improve performance somewhat, but not by much with all those disabled ROPs and texture units. ATI is selling defective chips (which would otherwise be thrown away) for more than they are worth, compared to their own previous generation offerings. Without competition, this is what we get.

I wasn't thinking about buying the card anyway, but it's not about that, it's about the precedent. ATI has been the major reason GPU prices have come down so much in the last two years, so I guess I expected more from them.

Oh well, at least I hope this extra cash will help AMD become more competitive on the CPU front as well. It seems they are on the right track. As for me, I'll hold on to my HD4850 for at least another year, no real need for a replacement yet.


By Parhel on 2/26/2010 9:35:30 AM , Rating: 2
quote:
I'm not sure why they were misinformed about availability, but why do you think they went overboard with the criticism about price?


I agree that it isn't priced attractively in AMD's lineup for the performance it delivers. But enthusiasts aren't the target audience for these cards. I'd either go down the line to the 5770 for power consumption, or up to the 5850/5870 for performance, depending on what I was building. But, yes, I think the criticism was a bit too harsh.

My thinking is that AMD has a limited supply of these chips, and intentionally isn't pricing them to sell like mad. These must have been stockpiled since before the launch of the 5870. As yields improve, these chips are sure to dry up. What they don't want to do is cannibalize the sales of higher end cards.

Maybe down the road we'll see an updated version similar to the GTX 260 core 216. Who knows.


By The0ne on 2/25/2010 6:16:28 PM , Rating: 2
Anand has proven time and time again to be less informative and at times a fanboy, as in Anand's Apple-loving articles.

I trust xbit-labs more now than I do Anand, for more reasons than those listed above. Here's a quote from xbit's conclusion:

quote:
On the other hand, we won’t voice our final verdict about the Radeon HD 5830 because the new card still has no official drivers...


So make your own call on the numbers, given the drivers used.


Die size vs Cost.
By StevoLincolnite on 2/25/2010 10:12:57 AM , Rating: 3
quote:
The 4890 currently fills a gap between the Radeon 5770 and the Radeon 5850, both in performance and pricing. However, it is built using an older 55nm process, making it larger and more expensive to produce than the 40nm Cypress chips


Well considering that the RV770's die size is 260mm2 and Cypress is 334mm2...
Shouldn't Cypress be the larger and thus more expensive one to produce? Despite it being built on a smaller manufacturing process?

Not to mention the older manufacturing process is more mature, so yields would be better than on the current 40nm process, which would also help lower costs, as you can get more usable parts out of a single wafer.
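The yield argument can be made concrete with the standard dies-per-wafer and Poisson yield approximations. A sketch with assumed numbers (the defect densities and wafer costs below are illustrative guesses, not TSMC figures, and it ignores that partially defective Cypress dies can be salvaged as 5830s, which is the whole point of the card):

```python
import math

# Standard approximations: gross dies per wafer, and Poisson yield exp(-D0 * A),
# where D0 is defect density (defects per mm^2) and A is die area (mm^2).
def dies_per_wafer(die_mm2, wafer_diameter_mm=300):
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius**2 / die_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_mm2)
    return int(gross - edge_loss)

def cost_per_good_die(die_mm2, defect_density, wafer_cost):
    good_dies = dies_per_wafer(die_mm2) * math.exp(-defect_density * die_mm2)
    return wafer_cost / good_dies

# Assumed: mature 55nm process vs. early 40nm process.
print(f"${cost_per_good_die(260, 0.002, 4000):.0f}")  # RV770-class die, mature yields: ~$29
print(f"${cost_per_good_die(334, 0.004, 5000):.0f}")  # Cypress-class die, early yields: ~$109
```

With numbers in this range, the smaller die on the mature process is cheaper per fully working chip, which supports the commenter's point; harvesting partial dies for the 5830 is how AMD claws some of that cost back.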




RE: Die size vs Cost.
By nafhan on 2/25/2010 11:01:46 AM , Rating: 2
You're right, the reason for the cost of the 5830 isn't the manufacturing process. The 5830 is priced where it is because it's made from rejected 5870/5850 dies. The 5830 will probably disappear once yields reach a certain level.
I wonder if they're planning to test the 32nm node (or whatever's next for TSMC) on a 5830 replacement - like they did with the 4830/4770?


RE: Die size vs Cost.
By nafhan on 2/25/2010 11:02:50 AM , Rating: 3
Looks like TSMC is moving from 40nm to 28nm.






This announcement
By Drag0nFire on 2/25/2010 10:22:13 AM , Rating: 2
was such a wonderful start for my day.

Price of the 5770 is already being driven down (see today's NewEgg deals). Or I could get the 5830 which is even better!




Eyefinity, bigger purpose
By knutjb on 2/25/2010 7:59:38 PM , Rating: 2
Think about industries that use lots of monitors and don't require FirePro/Quadro-grade cards, just lots of screens. This could become a major cash cow for AMD/ATI: fewer machines with a couple of cards versus current options.




"It seems as though my state-funded math degree has failed me. Let the lashings commence." -- DailyTech Editor-in-Chief Kristopher Kubicki














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki