



Discrete DirectX lineup now complete

We enjoy a very good lifestyle in North America and Europe, but most people around the world can't afford to shell out hundreds of dollars for a new video card. That's why the majority of ATI's video card sales are of models under $100.

ATI is launching the Radeon HD 5450 for those people. The card uses ATI's newest 40nm chip from TSMC, codenamed Cedar. The GPU die is a mere 59mm², enabling the Radeon HD 5450 to become the first DirectX 11 video card to sell for under $50. It will use DDR3 or DDR2 memory, depending on the Add-In Board (AIB) partner selling the card. 512MB versions will sell for around $50, while 1GB versions will sell for around $60.

The new card also targets another important market. The Home Theater PC (HTPC) segment is growing, and users there generally desire passive cooling. With a Thermal Design Power of only 19.1W and idle power usage of 6.4W, the Radeon HD 5450 makes a passive cooling solution practical. The reference card shown uses a heatsink that occupies two slots, but AIBs will mostly be selling single-slot solutions.

The Radeon HD 5450 has very similar specifications to the Radeon HD 4550, but adds support for ATI's Eyefinity multi-monitor technology. 24-inch monitors are now available around the $200 mark, and smaller screen sizes can be had for even less. Other features of note to HTPC enthusiasts are 8-channel LPCM audio and bitstreaming audio support.

The graphics division of AMD has launched a complete top-to-bottom discrete DirectX 11 product line, while NVIDIA hasn't launched a single DX11 card yet. Cypress, Juniper, Redwood, and Cedar will form the backbone of ATI's discrete sales for the next six months, and any new products will be based around those chips and their respins.

DirectX 11 is here to stay, and developers are working on the software to take advantage of it now that there is an established customer base.


 

ATI Radeon | HD 5770 | HD 5750 | HD 5670 | HD 5450
Stream Processors | 800 | 720 | 400 | 80
Texture Units | 40 | 36 | 20 | 8
ROPs | 16 | 16 | 8 | 4
Core Clock | 850MHz | 700MHz | 775MHz | 650MHz
Memory Clock | 1.2GHz (4.8GHz data rate) GDDR5 | 1.15GHz (4.6GHz data rate) GDDR5 | 1GHz (4GHz data rate) GDDR5 | 800MHz (1.6GHz data rate) DDR3
Memory Bus Width | 128-bit | 128-bit | 128-bit | 64-bit
Frame Buffer | 1GB | 1GB / 512MB | 1GB / 512MB | 1GB / 512MB
Transistor Count | 1.04B | 1.04B | 627M | 292M
TDP | 108W | 86W | 61W | 19.1W
Price Point (1GB / 512MB) | $179 | $149 / $129 | $119 / $99 | $59 / $49




hmmm....
By EasyC on 2/4/2010 9:45:53 AM , Rating: 2
That looks like it will fit into the small form factor with the right back plate. If this manages to outperform my current slim 4650, I'll be upgrading my HTPC!




RE: hmmm....
By Jansen (blog) on 2/4/2010 9:50:48 AM , Rating: 3
Wait a week...that's all I can say for now.


RE: hmmm....
By SilentSin on 2/4/2010 2:17:06 PM , Rating: 3
Looks like someone beat you to the punchline...
http://www.erodov.com/forums/ati-launches-radeon-h...

Basically seems like a 5670 using DDR3 and power consumption improvements. Half-height PCB is nice too, don't recall any 5670's going that route. Looks bandwidth starved but otherwise that card will stand in a league of its own for HTPC use if it can come in around the $75-85 range.


RE: hmmm....
By Cypherdude1 on 2/8/2010 8:11:40 PM , Rating: 2
I wonder if you can convert that VGA connector to DVI for dual DVI. I assume the center connector is HDMI. Can you convert the HDMI to S-Video?

I currently have an older ATI Radeon 9550 with VGA, DVI, and S-Video out. It also has passive cooling and uses one slot. I am content with it because I have a 25-foot S-Video cable leading to my TV, which I use when playing DVDs. ATI should put more effort into creating single-slot, passively cooled video cards with dual DVI plus HDMI/S-Video. They are perfect for dual-use office/entertainment environments. Having a loud fan while working or playing DVDs is unacceptable, and a card which uses two slots can be problematic.


RE: hmmm....
By Spuke on 2/4/2010 10:03:38 AM , Rating: 4
quote:
If this manages to outperform my current slim 4650, I'll be upgrading my HTPC!
Dude, check out this link. HD audio bitstreaming!! That alone is worth the upgrade.

http://www.avsforum.com/avs-vb/showthread.php?t=11...


RE: hmmm....
By blaster5k on 2/4/2010 10:21:08 AM , Rating: 2
Is there really much advantage to outputting Dolby TrueHD or DTS HD over 7.1 PCM? I heard it means you have to forgo mixing of other sounds.

I ask just out of curiosity, since my receiver doesn't support those codecs anyway.


RE: hmmm....
By Spuke on 2/4/2010 12:04:20 PM , Rating: 2
quote:
Is there really much advantage to outputting Dolby TrueHD or DTS HD over 7.1 PCM?
The main advantage, IMO, is to allow your receiver (preamp, DAC, etc.) to do the decoding instead of the computer/video. Depending on the quality of the receiver, it might be a better decoder than the computer/video. Not to mention, it allows you to choose the decoder instead of it being chosen for you.


RE: hmmm....
By omnicronx on 2/4/2010 12:36:35 PM , Rating: 3
No, that's incorrect. By nature, LPCM cannot be decoded, as it is already uncompressed audio.

You can't think of DTS-HD and DDTrueHD like previous compressed codecs; they are essentially just containers (like a zip file) which contain uncompressed LPCM audio.

Basically, the player either passes LPCM directly to the receiver, or it passes DTS-HD or DDTrueHD to the receiver, which is then decoded to LPCM audio. Either way you end up with LPCM audio; it's just a question of how it gets there.

As I stated in my other post, the real difference comes down to what your receiver is capable of. One of my receivers, for example, supports LPCM but does not support post-processing such as bass management. That being said, I would rather hear the track as it was intended; you should not have to modify completely uncompressed audio, as it should sound great the way the person who mastered it intended.

In my opinion, DTS-HD/DDTrueHD actually have a bigger disadvantage, and that's the inability to mix other tracks into the signal (for example, for extra features or internet downloads); this is not possible with these codecs because the signal must be left untouched until it reaches the receiver.


RE: hmmm....
By Spuke on 2/4/2010 1:43:41 PM , Rating: 2
quote:
You can't think of DTS-HD and DDTrueHD like previous compressed codecs, they are essentially just containers (like a zip file) which contain uncompressed LPCM audio.
Thanks for the explanation. Let me ask a question to make sure I really understand this, so all the receiver does is pass DTS-HD and DDTrueHD audio to a DAC for analog output. If this is true then the real benefit is unmolested audio until the DAC stage?

quote:
In my opinion, DTS-HD/DDTrueHD actually have a bigger disadvantage, and that's the inability to mix other tracks into the signal (for example, for extra features or internet downloads); this is not possible with these codecs because the signal must be left untouched until it reaches the receiver.
Would the mixing occur on the HTPC or the receiver? And if it's on the receiver can you prevent other data from mixing with this data? Sorry for the questions, I was obviously confused before on how this worked.


RE: hmmm....
By omnicronx on 2/4/2010 3:16:10 PM , Rating: 2
quote:
Thanks for the explanation. Let me ask a question to make sure I really understand this, so all the receiver does is pass DTS-HD and DDTrueHD audio to a DAC for analog output. If this is true then the real benefit is unmolested audio until the DAC stage?
Very rarely these days is any signal left completely unmolested by your multichannel receiver. In reality, the benefit has to do with what your receiver/player is capable of; if done correctly, there really is no benefit to using bitstream over LPCM. That being said, every receiver does things differently, so if there is a difference to be heard, it has more to do with the hardware and its inefficiencies, not LPCM or DTSMA/DDTrueHD itself.

quote:
Would the mixing occur on the HTPC or the receiver? And if it's on the receiver can you prevent other data from mixing with this data? Sorry for the questions, I was obviously confused before on how this worked.
On the player end, which is why it cannot be done via bitstream: the player is unable to mix into the track. LPCM is linear, so adding another track has absolutely no effect on the original audio stream (if you were wondering); the original stream is still left untouched, just as with the advanced codecs.

In the end it comes down to personal preference. If you want to be completely future-proof and have the ability to play higher-sampled audio similar to what is currently found on SACDs and DVD-As, then TrueHD/DTSMA is the way to go (i.e., if you are a true audiophile); otherwise you most likely won't notice the difference, and if you do, it could be for many reasons. For example, on my receiver TrueHD/DTSMA streams have some kind of gain (most likely applied by my receiver); they are a bit louder than LPCM, so without adjusting the volume and performing a listening test, TrueHD/DTSMA does sound slightly better. Of course, as soon as I compensate by raising the volume a tiny bit each time I switch back and forth, there is no audible difference.

In the end it's up to personal opinion; go do some sound tests and figure out what sounds best to you. That's all that matters. It's kind of like arguing over semantics: either way it's uncompressed audio, which surpasses anything we've heard before. A large percentage of the population would claim they cannot hear the difference between the iPod headphones and a good set of headphones, so are we really supposed to believe that a large percentage of people can hear the difference between two uncompressed audio sources?


RE: hmmm....
By Spuke on 2/4/2010 4:28:08 PM , Rating: 2
quote:
In the end it's up to personal opinion; go do some sound tests and figure out what sounds best to you.
Thanks much for the info. I'll have the opportunity to do some testing during the next two weeks so this will be a bit clearer then.


RE: hmmm....
By mcnabney on 2/4/2010 4:39:18 PM , Rating: 3
Pay attention to the above post.

Current, and likely all future, Blu-ray software for the PC does in fact downgrade the audio sampling rate when it translates the bitstream into LPCM before outputting to the receiver. If you send the bitstream intact, which requires the Protected Audio Path (PAP) that this card supports, your receiver will play the audio as it was intended. This downgrade is done to 'protect' the audio track in the new HD media.
When I switched from my 4850 (LPCM only) to a 5850 (LPCM and bitstream), I noticed the difference IMMEDIATELY. It is not as noticeable as going from 128kb/s MP3s to FLAC, but the improvement is there. Plus, your receiver - if it can decode the HD tracks - will now light up the TrueHD and DTS Master lights when playing.


RE: hmmm....
By omnicronx on 2/4/2010 5:21:00 PM , Rating: 2
This is very true, have to remember this is an article on an ATI card ;)

The same rules don't apply when playing from PC software; LPCM 7.1 is downsampled to 16-bit/48kHz, so you are definitely correct in saying your new card would sound better (as you explained, this is a DRM protection scheme).

Sorry Spuke, if you are talking about an HTPC, bitstream is most likely the way to go.


RE: hmmm....
By omnicronx on 2/4/2010 12:21:33 PM , Rating: 2
Little to no difference right now, as movie studios don't record their audio masters above 24-bit/96kHz; you can find some audio discs that do, but that's about it (in fact, most movies are not even that high). Other than that, the streams are usually pretty much identical, and remember: any way you do it, your receiver still ends up with LPCM data. Think of DTS-HD and TrueHD as zip-file containers; inside that container is still uncompressed audio.

That being said, I'm pretty sure LPCM does not support multichannel audio up to 24-bit/192kHz like DDTrueHD and DTS-HD do. Believe it or not, most LPCM BDs are sampled at 24-bit/48kHz; you always have to read the back of the BD to see what the audio was sampled at. Now, whether or not you can tell the difference between one uncompressed source and another is a totally different issue (especially considering that as you get older, the high- and low-end frequencies start to fade). I for one can tell the difference, but I can still hear over 20kHz with ease.

Bitstream proponents will also try to claim that LPCM suffers from jitter that the two bitstream formats do not, but that's just a load of BS and has been disproved many times.

Also, because of the nature of LPCM, many receivers can't do certain post-processing, such as matrixing 5.1 audio to 7.1 audio via Pro Logic or something similar.

So basically, until the studios play a little catch-up, LPCM is more than enough for 99% of the population (I'm only saying this because some audiophiles would disagree, but it's more than likely they are just trying to validate their purchase).
That being said, LPCM will always be part of the BD spec (i.e., it's required in every BD movie whether or not DDTrueHD or DTSMA is present), so you will never need more than LPCM to watch BD movies.


RE: hmmm....
By blaster5k on 2/4/2010 2:30:25 PM , Rating: 2
That's about what I figured. It mostly comes down to what kind of post-processing the receivers can do versus the PC then.

I have a 5.1 surround setup, so I have wondered what happens to the other two channels when movies use 7.1. Am I losing sound from the rear that should be mixed with my side channels? It's not really clear how that ends up getting handled. (I'm using ArcSoft TMT 3 with a GeForce 9400 motherboard)


RE: hmmm....
By omnicronx on 2/4/2010 3:42:01 PM , Rating: 2
quote:
Am I losing sound from the rear that should be mixed with my side channels?
As long as you set up your receiver and/or player correctly (i.e., there should be a spot in your receiver settings to specify how many speakers are connected), it should downmix from a four-channel rear setup to two; i.e., you should not be losing any sound.

Once again though this may vary from receiver to receiver, and would most likely be the same whether you used DTSMA/TrueHD or LPCM.


RE: hmmm....
By BruceLeet on 2/4/2010 10:53:41 AM , Rating: 2
I'm a very budget gamer, being in college and on my own. I bought the HD 4670 last spring for $109, and I can play the games I like, mainly CoD4/MW2, at 60+ FPS with everything on HIGH or EXTRA. I know MW2 maxes out at 75FPS because that's what Fraps reports, and that's my screen's refresh rate.

I'm just curious about this 5670, wondering if the investment would be worth it. DX11 is the biggest draw seeing as I'm running on W7 64-bit.


RE: hmmm....
By nafhan on 2/4/2010 11:14:20 AM , Rating: 2
http://www.anandtech.com/video/showdoc.aspx?i=3720

If what you have is doing fine, then you don't need to upgrade :)
In your situation, I'd wait until the price drops a bit, and grab a 5750/70.


RE: hmmm....
By Parhel on 2/4/2010 1:32:32 PM , Rating: 4
That's totally not worth it. If you can already play everything you want to play at max settings, then any upgrade is a waste of money.

Buying the latest low-end card every 6 months to a year is the worst way to upgrade... especially if you're on a tight budget. Just buy one high-end card and be happy with it for 2-3 years.

I've managed to upgrade every three years and never want for performance:

1999 - TNT2
2002 - 9700Pro
2005 - X800XL
2008 - 8800GT

For you, those might look like huge jumps, but I only upgraded when there was a game I couldn't play well. When I couldn't play Neverwinter Nights, I bought the 9700Pro. When I bought Oblivion and it would barely run, I bought the 8800GT.

My gaming budget is small as well, and in my opinion just save your money this round, and buy something nice when you can afford it.


RE: hmmm....
By dark matter on 2/7/2010 6:11:36 AM , Rating: 2
I do exactly the same myself.

Interestingly, I followed the same upgrade path as yourself. The 9800 Pro (ATI) was a fantastic card, wasn't it? Until Oblivion came along - it was fine until more than two enemies were on screen at once.

The 8800GT - another sterling card. I believe this card still lives on today in some form (I have lost track of the renames it has had), but when it came out, the bang for buck was absolutely amazing (almost as good as the 8800GTX!).

It's nearly time for my latest upgrade. Just waiting until Autumn (Fall) and see what games are coming out (Usually the best games come out around this time, as well as the best cards).

Looking forward to playing a few of the games I bought this year again with my new card. :)



RE: hmmm....
By nafhan on 2/4/2010 11:06:50 AM , Rating: 2
For HT, it will be better due to lower power usage and more advanced features. For gaming, it'll be worse. It's got 1/4 the number of shaders and much less memory bandwidth. So, I'd say either look for a low profile 5670 (which they will probably make eventually if they don't already), or keep what you got.


Pretty amazing.
By namechamps on 2/4/2010 9:34:32 AM , Rating: 3
I won't be buying one but it is phenomenal the amount of processing power available in a $50 (or $100) package these days.

I wonder if it makes more sense to use multiple low end GPU or one high end GPU when using them for general purpose calculations (i.e. number crunching).

What gets you the best $ per TFLOP?




RE: Pretty amazing.
By slashbinslashbash on 2/4/2010 10:31:02 AM , Rating: 4
If what you need is raw compute power, it's not hard to see what card gives the most bang per buck. It's the shaders that do the heavy lifting, and just from the chart in this article you can see that the 400 shaders in the 5670 cost only twice as much as the 80 shaders in the 5450. That's 5x the performance for 2x the price. Not to mention the 5670 runs at a 20% higher clockspeed and has 5x the memory bandwidth of the 5450. Not a bad trade-off. And then it's less than 2x the price to more than double the performance (2x the shaders, at a 10% clock increase) from the 5670 to the 5770.

However, the 5870 is more than double the price of the 5770 and gives 2x the shaders (so in this case, it's not worth it), but the 5970 again doubles theoretical performance over the 5870 at 1.5x the price. The 5850 is 1.5x the performance of the 5770 at 1.67x the price. So in ATI's lineup, you get the most bang for your buck with the 5770. They need to lower the prices of the 58xx series to at least bring it in line, price-to-performance-wise, with the 5770. I bet we'll see a 58xx price drop when NVIDIA releases their new chips.
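The back-of-the-envelope math above can be sketched in a few lines. This is a rough illustration only: it assumes the common rule of thumb of 2 single-precision ops per stream processor per clock for these parts, uses the prices from the article's table (512MB prices where two are listed), and deliberately ignores memory bandwidth, which matters a lot in real workloads.

```python
def peak_gflops(stream_processors, core_clock_ghz):
    """Theoretical peak single-precision GFLOPS: 2 ops per SP per clock."""
    return stream_processors * 2 * core_clock_ghz

# Specs from the article's table: (stream processors, core clock GHz, price USD)
cards = {
    "HD 5450": (80,  0.650,  49),
    "HD 5670": (400, 0.775,  99),
    "HD 5750": (720, 0.700, 129),
    "HD 5770": (800, 0.850, 179),
}

for name, (sp, clk, price) in cards.items():
    gf = peak_gflops(sp, clk)
    print(f"{name}: {gf:6.0f} GFLOPS peak, ${price / gf:.3f}/GFLOP")
```

By this crude theoretical measure the 5770 comes out far ahead of the 5450, matching the comment's conclusion; actual compute throughput would depend on how well the workload keeps the shaders fed.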


Hm
By bradmshannon on 2/4/2010 2:04:50 PM , Rating: 2
How much better would this $50 card be than my 8600GT? :)




RE: Hm
By amanojaku on 2/4/2010 4:47:14 PM , Rating: 2
Not at all, if you're looking at frame rates. According to Jansen, the 5450 is similar to the 4550. The 8600 GT beats the 4550 in all the benchmarks I've seen, in many cases doubling the frame rate. That's not a surprise, since the 4550 is entry level, while the 8600 is yesterday's mainstream card. The 5450 actually gives less performance than the 4550, according to the link below.

If you want to lower your power draw, the 5450 uses half of what the 8600 GT uses. But you didn't care about that, did you? ;-)

http://www.anandtech.com/video/showdoc.aspx?i=3734


5830
By plopke on 2/5/2010 10:34:24 AM , Rating: 2
;o What happened to the 5830? It was delayed until the 5th, but nothing?

















