
RV740 (Source: Guru of 3D)
Benchmarks place it comfortably between the HD4830 and HD4850

Despite an uncharacteristically tight-lipped stance from ATI on its first 40nm parts, hardware review website Guru of 3D has managed to secure a sample of an RV740-based graphics card and put it through its paces. At first glance, the specifications suggest that it is little more than a shrunken RV770LE core, but a closer look reveals several improvements.

According to the preview published today, the GPU boasts 640 shaders, 32 TMUs, and 16 ROPs, all identical to the RV770LE. However, both the core and memory clocks receive significant bumps: the core from 575MHz to 650MHz, and the memory from 1800MHz GDDR3 to 3200MHz GDDR5. The higher core clock brings the GPU's reported math processing rate up to 900 GigaFLOPS, while the substantial boost in memory frequency offsets the performance penalty of the RV740's narrower 128-bit memory bus.
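The bandwidth trade-off is easy to check with back-of-the-envelope arithmetic. A quick sketch, assuming the quoted memory clocks are effective (DDR/QDR-adjusted) data rates, as is conventional:

```python
def bandwidth_gb_s(bus_width_bits: int, effective_rate_mhz: int) -> float:
    """Theoretical peak memory bandwidth in GB/s.

    bus_width_bits / 8 gives bytes per transfer; the effective
    data rate (in MT/s) gives millions of transfers per second.
    """
    return bus_width_bits / 8 * effective_rate_mhz / 1000

# RV770LE: 256-bit bus, 1800MHz effective GDDR3
print(bandwidth_gb_s(256, 1800))  # 57.6 GB/s
# RV740: 128-bit bus, 3200MHz effective GDDR5
print(bandwidth_gb_s(128, 3200))  # 51.2 GB/s
```

So despite halving the bus width, the jump to GDDR5 keeps the RV740 within roughly 11 percent of the RV770LE's theoretical peak bandwidth.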

Every single benchmark result published in the review places the RV740-based card squarely between ATI's existing Radeon HD4830 and HD4850 graphics cards. For example, at 2560x1600 with 8xMSAA and 16xAF in the popular first-person shooter Left 4 Dead, the RV740 turns in 28 FPS, flanked by the 25 FPS and 30 FPS scores of the two existing cards. The card does exhibit a significant performance drop at higher resolutions, likely a result of the 128-bit memory bus width and the relatively modest 512MB of memory. Other reported benchmarks include Far Cry 2, Crysis WARHEAD, Call of Duty 5: World at War, Brothers in Arms: Hell's Highway, and 3DMark Vantage.

The MSRP for RV740-based graphics cards is expected to be $99 USD, which should make them a very attractive offering in the upper-mainstream segment. While the official name of the video card has not been released, the author of the article suspects it will be called the Radeon HD4750, a logical name for a card with these characteristics.


RE: EDID corruption
By omnicronx on 2/25/2009 5:09:17 PM , Rating: 2
quote: "we know for a fact that consoles and HD-DVD players need/use EDID in the same way?"
They should. For example, if I turn on my 360 and my receiver is not on, it will think that I am incapable of 5.1 surround sound and will default back to 2ch. If I have my amp on, it will read the EDID information and 5.1 works without a hitch.

The HDMI/HDCP handshake is definitely part of the problem, but there is no reason I should have to override this information in software if the video card is not to blame. (Keep in mind I have both my amp and TV on when I boot my computer; the issue of having one of the devices off at boot is not related to the issue I am having.) HDMI works perfectly fine without the override until I boot into Windows, which is what makes me think this is a video card issue (and I am not alone in thinking this).

RE: EDID corruption
By Motoman on 2/25/2009 6:18:29 PM , Rating: 3
...don't suppose you have an Nvidia card lying around you could test with, using all the same other components? Would be an interesting exercise.


Copyright 2016 DailyTech LLC.