

47 comment(s) - last by Darkskypoet on Feb 26 at 6:46 PM


RV740  (Source: Guru of 3D)

Benchmarks place it comfortably between the HD4830 and HD4850

Despite an uncharacteristically tight-lipped stance from ATI on its first 40nm parts, hardware review website Guru of 3D has managed to secure a sample of an RV740-based graphics card and put it through its paces. At first glance, the specifications suggest it is little more than a shrunken RV770LE core, but a closer look reveals several improvements.

According to the preview published today, the GPU boasts 640 shaders, 32 TMUs, and 16 ROPs, all the same as the RV770LE. However, the core and memory clocks both receive a significant bump: the core rises from 575MHz to 650MHz, which brings the reported math processing rate for the GPU up to 900 GigaFLOPS, while the memory moves from 1800MHz GDDR3 to 3200MHz GDDR5. The substantial boost in memory frequency works to offset the performance penalty of the RV740's narrower 128-bit memory bus.
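As a rough sanity check (my back-of-the-envelope math, not figures from the preview), peak memory bandwidth is bus width in bytes times effective transfer rate, which shows the GDDR5 bump recovering most, though not all, of the bandwidth lost to the half-width bus:

```python
# Rough peak-bandwidth comparison using the clocks quoted above.
# Illustrative math only; the preview does not publish these figures.

def bandwidth_gb_s(bus_width_bits: int, effective_mhz: int) -> float:
    """Peak bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

rv740   = bandwidth_gb_s(128, 3200)  # 128-bit bus, 3200MHz effective GDDR5
rv770le = bandwidth_gb_s(256, 1800)  # 256-bit bus, 1800MHz effective GDDR3

print(f"RV740:   {rv740:.1f} GB/s")    # 51.2 GB/s
print(f"RV770LE: {rv770le:.1f} GB/s")  # 57.6 GB/s
```

By this estimate the faster GDDR5 leaves the new card within roughly 11% of the RV770LE's bandwidth despite a bus half as wide.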

Every single benchmark result published in the review places the RV740-based card right between the existing Radeon HD4830 and HD4850 graphics cards, also from ATI. For example, at 8xMSAA and 16xAF in the popular first person shooter Left 4 Dead at 2560x1600, the RV740 turns in 28 FPS, flanked by scores of 25 and 30 FPS from the two existing cards. The card does exhibit a significant performance drop at higher resolutions, likely the result of the 128-bit memory bus width and the relatively modest 512MB of memory. Other reported benchmarks include Far Cry 2, Crysis Warhead, Call of Duty 5: World at War, Brothers in Arms: Hell's Highway, and 3DMark Vantage.

MSRP for RV740-based graphics cards is expected to be $99 USD, which should make them a very attractive offering in the upper-mainstream segment. While the official name of the video card has not been released, the author of the article suspects it will be called the Radeon HD4750, a logical name for a card with these characteristics.






new AMD Card
By frozentundra123456 on 2/25/2009 7:40:28 PM , Rating: 2
I wonder how much power this card uses. I don't really see a need for a card to fit between the 4830 and 4850 unless it saves a lot of power. I would have expected the first card at 40nm to be a high performance one.

I really liked the simplicity of AMD's lineup: basically three levels of cards with a lower and higher performance part in each level. Now their lineup is getting really complicated and confusing (or maybe it's just me).

What did they do, hire nVidia's team in charge of naming cards?? At least they haven't yet stooped to bringing out previous generation hardware and giving model numbers to make it seem like new generation hardware (have they??).




RE: new AMD Card
By JSK on 2/25/2009 8:08:02 PM , Rating: 5
At least this is actually a "new" card and a "new" process unlike the G92 renaming debacle.

Nvidia changes stickers while AMD shrinks the die, and you're trying to draw a funny comparison? This isn't even on the same level.

In all likelihood the 4830 will be phased out and maybe even the 4850 as well.


RE: new AMD Card
By PrinceGaz on 2/25/2009 9:20:10 PM , Rating: 4
quote:
I would have expected the first card at 40 nm to be a high performance one.


nVidia learned the very painful lesson from that approach back in the GeForce FX 5800 days, and ATI/AMD have always steered clear of repeating it.

Rather than debut a smaller fab process with the latest and greatest chip, it is best to use it on a low-to-mid range product until it has proven satisfactory yields, before introducing it at the high end. Just because you have a new, smaller fab process available does not necessarily mean it produces better or faster chips than the older fabs at first. It takes time for the fab plant to get better at making parts at that process size, so for several months it is often more economical to produce equally fast or faster parts on the older facility.

It's best to start with the new relatively untested facility with low-end parts where there should still be lots of usable chips even with a lot of defects per die, before progressing to manufacturing the large high-end chips which require a much lower defect-rate per die for satisfactory yields.
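The yield argument above can be made concrete with the textbook Poisson defect model (my illustration with made-up numbers, not figures from the post): the fraction of good dies falls off exponentially with die area, so on an immature process with a high defect density, small dies survive far better than large ones.

```python
import math

# Classic Poisson yield model: yield ~= exp(-A * D), where A is die area
# in cm^2 and D is defect density in defects/cm^2. All numbers hypothetical.

def die_yield(area_cm2: float, defects_per_cm2: float) -> float:
    return math.exp(-area_cm2 * defects_per_cm2)

immature_d = 0.8  # assumed defect density on a young process, defects/cm^2
small = die_yield(1.4, immature_d)  # assumed mainstream-sized die
large = die_yield(4.7, immature_d)  # assumed high-end-sized die

print(f"small die yield: {small:.0%}")  # ~33%
print(f"large die yield: {large:.0%}")  # ~2%
```

With the same defect density, the small die yields an order of magnitude more good chips per defect-ridden wafer, which is exactly why a new node debuts on mainstream parts.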


RE: new AMD Card
By MikeMurphy on 2/26/2009 1:34:44 PM , Rating: 3
Having owned both a 4830 and a 4850, my expectation is that the 4830 will still be a better buy for the performance-oriented purchaser than this new chip, if prices reflect the positioning of the stock performance specs.

The RV770LE chips (4830) were binned for one of two reasons: they had defective SIMDs, OR they didn't clock well enough to be used as an RV770. From my research and experience, the vast majority of these binned chips have defective SIMDs, which means they clock on par with the RV770. You will find that over half of the RV770LEs clock from the stock 575MHz to well over 735MHz, which is a phenomenal free overclock. These chips thrive with voltage and can usually break 800MHz with a small bump in power.

Anyway, it's one hell of a value chip, and I suspect this new chip won't offer the same value when overclocking.


RE: new AMD Card
By coldpower27 on 2/25/2009 10:33:31 PM , Rating: 3
quote:
I would have expected the first card at 40 nm to be a high performance one.


Both ATi and nVidia got burned big time using that approach: ATi on its HD 2900 XT, and nVidia on the GeForce FX 5800.

The HD 4750 makes sense; way back when, nVidia had a GeForce 6800 GS for the mid range, which was eventually replaced with the GeForce 7600 GT.

You want to ditch the wider memory interfaces to cut cost if at all possible. Hence you get ATi doing the 256-bit-only thing with the HD 4800 series.

nVidia doesn't really need to make a new core for the mid range. The G92b is alright for now.

I wonder if a 1600-shader card with 64 TMUs is possible at 40nm... if they can double the core resources from 55nm to 40nm on the mid range card, why not the high end?


RE: new AMD Card
By Natfly on 2/26/2009 12:49:36 AM , Rating: 2
There are rumors that TSMC's 40nm process is quite borked and offers little to no power/performance gain compared to the 55nm process. I guess we'll see when the reviews eventually come out. If that's true, then the only benefit of 40nm will be cost.


RE: new AMD Card
By CyberHawk on 2/26/2009 6:55:47 AM , Rating: 4
It's just you ;)


RE: new AMD Card
By nafhan on 2/26/2009 8:07:56 AM , Rating: 2
The 4830 is a partially defective 4850 with 160 stream processors turned off and a 256-bit memory interface, on a 55nm process. This new card is designed with 640 stream processors and a 128-bit memory interface, all on a 40nm process. If nothing else, that translates into huge cost savings for AMD and (hopefully) us.


Bandwidth Misconceptions
By shabodah on 2/25/2009 5:04:49 PM , Rating: 5
I've been fed up with hearing about 128-bit memory "causing" issues since the 7600GT. A 128-bit card running 3200MHz GDDR5 has about 89% of the bandwidth of a 256-bit card running GDDR3 at 1800MHz. Yet this new card is still faster than the older one because the core is faster. Furthermore, 512-bit cards with 512MB of memory run out of steam at high resolutions just as much as the 256-bit and 128-bit ones do. Let's all get bandwidth straight and quit blaming it when it simply is not at fault.




RE: Bandwidth Misconceptions
By davekozy on 2/25/2009 6:02:36 PM , Rating: 3
The 512-bit cards (GTX 280 and GTX 285) have 1GB of memory, not 512MB. Cards with only 512MB, none of which have a 512-bit bus as far as I know, don't perform well at high resolutions because they don't have enough memory. Not sure how much bandwidth affects performance at high res compared to running out of memory.
128-bit is probably sufficient if you run the memory fast enough. 512-bit 2500MHz GDDR3 on the 285 is as much bandwidth as 10000MHz GDDR5 at 128-bit!


RE: Bandwidth Misconceptions
By shabodah on 2/25/2009 6:43:52 PM , Rating: 2
Look into some older generation high-end cards or workstation cards. Regardless, you are agreeing with me. Even the 2GB GTX285 shows significant gains at high resolutions over the 1GB GTX285 cards, and that's with no other variables to account for.


RE: Bandwidth Misconceptions
By The0ne on 2/25/2009 7:35:48 PM , Rating: 2
Or you can head over to Anandtech to read one of their latest articles on video cards. Multi-GPU and 1GB+ memory :D

http://www.anandtech.com/video/showdoc.aspx?i=3517


RE: Bandwidth Misconceptions
By oopyseohs on 2/25/2009 6:38:27 PM , Rating: 3
You're absolutely right. However, I suspect a similar card with 512MB of GDDR5 at 3200MHz pumping through a 256-bit bus would hold a pretty good advantage over this 128-bit one. Furthermore, that extra bandwidth would probably help to keep frames high at the larger resolutions. Certainly we shouldn't be "blaming" the bus width on this decrease, but you have to agree that it does have an effect.


RE: Bandwidth Misconceptions
By Alexstarfire on 2/25/2009 11:22:05 PM , Rating: 1
No..... just no.


RE: Bandwidth Misconceptions
By Darkskypoet on 2/26/2009 6:46:09 PM , Rating: 2
The width of the memory channels in question doesn't matter if you have to hit main system memory because you've run out of local video memory! Ever hear a hard disk thrashing? You could have the world's fastest 1GB of system memory... if you require 1.5GB, you're hitting the next slowest part of the chain (i.e. hard drive thrash). PERIOD.

This chip is in place to replace the 4830. I bet you most 4830s out there are actually good chips that could be made into 4850s at least, if not 4870s. This chip essentially stops them from having to give away good cores at such a deep discount. Further, the 40nm process at TSMC is leaky right now, so that's another reason there's no point in going high end yet.

Go high volume, and trash the ASP of Nvidia's upper mainstream. AMD did it to the 8800 GT, 9800 GT, and 260. Why stop now? Very good performance, I'd imagine adequate overclocking, but more importantly: CHEAP.

A cheap-as-borscht 4830 replacement, with an option to twin it with DDR3 for the $65 range. Why waste, or even manufacture, another 48xx core for a 4830-esque product when you don't have to? There have been more than a few reviews pointing out that the 4830 is great, if only it were a little bit cheaper. Well, welcome to that part.


A Tight Fit
By jskirwin on 2/25/2009 4:49:55 PM , Rating: 3
quote:
Every single benchmark result published in the review places the RV740-based card right between the existing Radeon HD4830 and HD4850 graphics cards, also from ATI.


In most of the review's benchmarks there was about a 6 FPS difference between the 4830 and 4850. The RV740 fits right in between the two: 3 FPS more than the 4830, 3 FPS less than the 4850.

Seems like a tight fit to me for a new product.




RE: A Tight Fit
By Motoman on 2/25/2009 4:52:16 PM , Rating: 2
...IIRC, a 4850 is going for around, what, $150 now? If this thing comes in at $100 or less, at a loss of 3FPS, then the 4850 (and probably the 4830) become useless anyway.


RE: A Tight Fit
By Lonyo on 2/25/2009 7:38:30 PM , Rating: 2
If indeed the HD4830 doesn't just get removed from the product line altogether.
Why continue to make more expensive cards that perform worse?
With (presumably) improving yields on the HD4850/70 core, the 30 shouldn't be as necessary for using up failed cores, and a cheaper to manufacture (as a whole product) RV740 makes sense as a total replacement.


RE: A Tight Fit
By MonkeyPaw on 2/26/2009 7:45:07 AM , Rating: 3
This card will be cheaper to produce, thus the best to market at $100. Don't forget the RV790 is also on its way, and it's a die shrink with modest improvements. Chances are the 4830, 4850, and 4870 will all go EOL once the (mildly) faster RV790 products launch. I don't know what the RV790 products might be called--perhaps we'll see a 4950 and 4970.


Engineering sample
By Jansen (blog) on 2/25/2009 4:55:47 PM , Rating: 2
Please keep in mind that this is a preproduction engineering sample. Final specs are likely to be different, and performance will go up with driver tweaks.




RE: Engineering sample
By Alpha4 on 2/25/2009 5:43:19 PM , Rating: 2
By any chance, can you point me towards a preproduction benchmark for an existing card for comparison? I don't doubt these numbers will get better, but I'm curious about how much better.


Power consumption
By RU482 on 2/25/09, Rating: 0
EDID corruption
By omnicronx on 2/25/09, Rating: -1
RE: EDID corruption
By Motoman on 2/25/2009 4:24:24 PM , Rating: 2
...at the possible risk of exposing my own ignorance, what is EDID? I have used/shipped an awful lot of ATI video cards since the 3xxx series hit, and haven't heard any complaints...


RE: EDID corruption
By bankerdude on 2/25/2009 4:29:08 PM , Rating: 2
EDID= Extended Display Identification Data


RE: EDID corruption
By Motoman on 2/25/2009 4:32:27 PM , Rating: 2
Ah. What is it that ATI is screwing up here, and is it something that everyone should be seeing?


RE: EDID corruption
By bankerdude on 2/25/2009 4:38:48 PM , Rating: 2
EDID is what a display uses to describe its identity and capabilities to a source, such as video formats, audio formats, lip-sync delays, etc. The source (e.g. ATI graphics card, DVD, STB, etc) can then select output (e.g. video formats) in accordance with what the display supports. Honestly I've seen my fair share of EDID corruption on Nvidia as well.


RE: EDID corruption
By Samus on 2/25/2009 4:43:19 PM , Rating: 2
What happens when EDID corruption occurs?

BTW $100 for a card this powerful is pretty damn good, especially when it will likely sell for $80-$90.


RE: EDID corruption
By Clauzii on 2/25/2009 8:21:30 PM , Rating: 2
On my XP installation it shows up in that I have to set the resolution and refresh rate from the right-click desktop properties. If I use CCC, all resolutions are at 60Hz. No big deal, if you know the other way to do it.

PS: A big round of applause for the ATI DNA drivers. My AGP AH3650SILENT (ASUS) will finally run 3DMark05 and work with CCC. It's funny that the card only runs with the Sapphire hotfix driver (not everything working..) or the brilliant DNA one, but NOT the driver from ASUS?... oh well..


RE: EDID corruption
By omnicronx on 2/25/2009 4:33:58 PM , Rating: 2
Wiki:Extended display identification data (EDID) is a data structure provided by a computer display to describe its capabilities to a graphics card. It is what enables a modern personal computer to know what kind of monitor is connected. EDID is defined by a standard published by the Video Electronics Standards Association (VESA). The EDID includes manufacturer name, product type, phosphor or filter type, timings supported by the display, display size, luminance data and (for digital displays only) pixel mapping data.

In my case (and many others), for whatever reason my ATI card screws with the EDID information when connecting via HDMI to my receiver, and then to my TV. It essentially strips off the TV's information and results in a black screen. VGA output is not affected; this will only happen when using DVI or HDMI. I had to manually override my display driver's EDID information to get it to display correctly.

You can see why this may be a problem for HTPC users.
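For the curious, the structure described above is easy to poke at: the base EDID block is 128 bytes, starts with a fixed header, carries a checksum, and packs the manufacturer ID into two bytes as three 5-bit letters. A minimal sketch (my own illustration; offsets per the VESA spec, and the sample block with code 0x10AC is synthetic):

```python
# Minimal EDID base-block reader: header check, checksum, manufacturer ID.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def header_ok(edid: bytes) -> bool:
    # Every EDID base block begins with this fixed 8-byte pattern.
    return edid[:8] == EDID_HEADER

def checksum_ok(edid: bytes) -> bool:
    # All 128 bytes of the base block must sum to 0 mod 256.
    return sum(edid[:128]) % 256 == 0

def manufacturer_id(edid: bytes) -> str:
    # Bytes 8-9: big-endian word holding three 5-bit letters ('A' == 1).
    word = (edid[8] << 8) | edid[9]
    return "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

# Example: a synthetic block carrying manufacturer code 0x10AC ("DEL").
blk = bytearray(128)
blk[:8] = EDID_HEADER
blk[8], blk[9] = 0x10, 0xAC
blk[127] = (256 - sum(blk[:127]) % 256) % 256  # fix up the checksum byte

print(manufacturer_id(blk), header_ok(blk), checksum_ok(blk))  # DEL True True
```

A card or receiver that mangles any of these fields hands the source a bogus capability list, which is exactly the black-screen/wrong-mode behavior described in the comment above.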


RE: EDID corruption
By Motoman on 2/25/2009 4:35:54 PM , Rating: 2
...that is interesting, and would explain why I haven't heard of it yet because I don't know of anyone using their machines for HTPC type uses.

Perhaps ironically, though, I just ordered parts for my own self to build an HTPC for our use - including an HD4870 (I wants to play WoW on teh big screen FTW!!1!). Reckon I'll find out...


RE: EDID corruption
By omnicronx on 2/25/2009 4:47:07 PM , Rating: 2
quote:
Perhaps ironically, though, I just ordered parts for my own self to build an HTPC for our use - including an HD4870 (I wants to play WoW on teh big screen FTW!!1!). Reckon I'll find out...
Well if you have an issue, it can be rectified, although it can be a painful process.

http://www.avsforum.com/avs-vb/showthread.php?t=10...

This is the issue I had, I didn't get a blank screen, but I could only use certain resolutions and I got no sound through HDMI until I manually overrode my EDID information.


RE: EDID corruption
By omnicronx on 2/25/2009 5:16:11 PM , Rating: 2
I would also like to point out the fix only works for Vista/7. Anyone using XP has been left in the dark.

Furthermore even with the fix I am limited to 48khz sound for whatever reason.


RE: EDID corruption
By Motoman on 2/25/2009 6:17:10 PM , Rating: 2
...well that looks like fun.

I guess I'll be wary when I put my own HTPC together. Hopefully I won't run into any issues.

I wonder... is it a particular interaction between ATI cards and a particular brand/range of receivers? Does it apply to all video cards? All receivers?


RE: EDID corruption
By Motoman on 2/25/2009 4:36:50 PM , Rating: 2
A thought: isn't it possible that having the receiver in the middle is causing the issue? If for some reason the receiver isn't passing this data through correctly?


RE: EDID corruption
By omnicronx on 2/25/2009 4:44:23 PM , Rating: 2
quote:
A thought: isn't it possible that having the reciver in the middle is causing the issue?
In my case, without a doubt, it works perfectly if I attach it straight to the TV. This being said my receiver has no problem with any other device attached via hdmi (I have a 360, PS3 and HD-DVD player all using HDMI without issue). It is the video card that for whatever reason is incorrectly reading the EDID information.


RE: EDID corruption
By Motoman on 2/25/2009 4:49:42 PM , Rating: 3
...do we know for a fact that consoles and HD-DVD players need/use EDID in the same way?

It just seems to me that if the video card works fine when connected directly to the TV, then in and of itself it has no problem reading/using EDID... and it is therefore most logical to presume that *something* the receiver is doing is the source of the issue...


RE: EDID corruption
By omnicronx on 2/25/2009 5:09:17 PM , Rating: 2
quote:
...do we know for a fact that consoles and Hd-DVD players need/use EDID in the same way?
They should. For example, if I turn on my 360 and my receiver is not on, it will think I am incapable of 5.1 surround sound and will default back to 2ch. If I have my amp on, it will read the EDID information and 5.1 works without a hitch.

The HDMI/HDCP handshake is definitely part of the problem, but there is no reason I should have to override this information in software if the video card is not to blame. (Keep in mind I have both my amp and TV on when I boot my computer; having one of the devices off at boot is not related to the issue I am having.) HDMI works perfectly fine without the override until I boot into Windows, which is what makes me think this is a video card issue (and I am not alone in thinking this).


RE: EDID corruption
By Motoman on 2/25/2009 6:18:29 PM , Rating: 3
...don't suppose you have an Nvidia card laying around you could test with, using all the other same components? Would be an interesting exercise.


RE: EDID corruption
By Targon on 2/25/2009 4:58:11 PM , Rating: 2
It may be the DVI-to-HDMI connector supplied with the video card. I have heard (but not verified) that the one provided tends to be problematic, while others that can be ordered online work a LOT better.


RE: EDID corruption
By omnicronx on 2/25/2009 5:10:43 PM , Rating: 2
This is also an issue, but in my case I don't use DVI at all. I have a physical HDMI port.


RE: EDID corruption
By SunAngel on 2/25/2009 7:49:43 PM , Rating: 2
I had a similar problem with the HD2400XT (w/o a fan). My Sony XBR could not be read by the card. But I then switched to an HD2600XT (with a fan) and the EDID info was read correctly.

Well, let me backtrack for a moment: the 2400 would read as high as 1280x720 when connected, but my TV is 1080p. Going back and looking at the spec for the 2400, it never explicitly said 1080p. So that's when I decided to go up a notch to the 2600XT. Lo and behold, it worked, except now I have to deal with fan noise.


RE: EDID corruption
By William Gaatjes on 2/26/2009 10:48:02 AM , Rating: 2
When CRT monitors were a big item and LCD monitors started to emerge, DDC appeared:

http://en.wikipedia.org/wiki/Display_Data_Channel

quote:
The Display Data Channel or DDC is a digital connection between a computer display and a graphics adapter that allows the display to communicate its specifications to the adapter. The standard was created by the Video Electronics Standards Association (VESA).


Seems to me EDID is the renamed, improved, and extended version of DDC.

http://en.wikipedia.org/wiki/Extended_display_iden...

quote:
Extended display identification data (EDID) is a data structure provided by a computer display to describe its capabilities to a graphics card. It is what enables a modern personal computer to know what kind of monitor is connected. EDID is defined by a standard published by the Video Electronics Standards Association (VESA). The EDID includes manufacturer name, product type, phosphor or filter type, timings supported by the display, display size, luminance data and (for digital displays only) pixel mapping data.


RE: EDID corruption
By Screwballl on 2/25/2009 4:32:09 PM , Rating: 2
Must be an issue on your side; I have never had this problem with any video card ever (ATI, Nvidia, or others)...


RE: EDID corruption
By omnicronx on 2/25/2009 4:53:03 PM , Rating: 2
That's because it does not affect most users (as my other posts have explained), but with HTPCs becoming more prevalent, it surely will very soon if the issue is not fixed. Video cards with the ability to pass through 7.1ch LPCM essentially become useless for HTPC use if you don't know what you are doing, and even then it doesn't always work.


RE: EDID corruption
By The0ne on 2/25/2009 7:32:16 PM , Rating: 2
The problem is out there for some users; just check around the various technical forums to read about it. Sadly, I haven't owned an ATI card since the 9800pro, so I can't comment :D Good luck with your search, and I hope ATI addresses the problem soon.


RE: EDID corruption
By V3ctorPT on 2/26/09, Rating: 0
"Young lady, in this house we obey the laws of thermodynamics!" -- Homer Simpson











Copyright 2014 DailyTech LLC.