
RV740  (Source: Guru of 3D)

Benchmarks place it comfortably between the HD4830 and HD4850

Despite an uncharacteristically tight-lipped stance from ATI on its first 40nm parts, hardware review website Guru of 3D has managed to secure a sample of an RV740-based graphics card and put it through its paces. At first glance, the specifications suggest that it is little more than a shrunken RV770LE core, but a closer look reveals several improvements.

According to the preview published today, the GPU boasts 640 shaders, 32 TMUs, and 16 ROPs, all identical to the RV770LE. However, the core and memory clocks both receive a significant bump: the core rises from 575MHz to 650MHz, and the memory moves from 1800MHz GDDR3 to 3200MHz GDDR5. The higher core clock brings the reported math processing rate for the GPU up to 900 GigaFLOPS, while the substantial boost in memory frequency offsets the bandwidth lost to the RV740's narrower 128-bit memory bus.
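The clock and bus figures above can be sanity-checked with a back-of-the-envelope calculation. This is a minimal sketch, assuming AMD's usual rating of 2 FLOPs (one multiply-add) per stream processor per cycle and treating the quoted memory clocks as effective data rates; note the simple formula at the quoted 650MHz lands a bit below the preview's 900 GigaFLOPS figure, which may reflect a different shipping clock.

```python
# Back-of-the-envelope throughput figures for the specs quoted above.
# Assumptions (not from the article): 2 FLOPs (one multiply-add) per
# stream processor per cycle; quoted memory clocks are effective rates.

def peak_gflops(shaders: int, core_mhz: float, flops_per_cycle: int = 2) -> float:
    """Theoretical shader throughput in GFLOPS."""
    return shaders * core_mhz * 1e6 * flops_per_cycle / 1e9

def mem_bandwidth_gbs(bus_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s (bus width in bits, effective clock in MHz)."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# RV740 as previewed: 640 shaders @ 650MHz, 128-bit GDDR5 @ 3200MHz effective
print(peak_gflops(640, 650))         # → 832.0 GFLOPS
print(mem_bandwidth_gbs(128, 3200))  # → 51.2 GB/s

# RV770LE for comparison: 256-bit GDDR3 @ 1800MHz effective
print(mem_bandwidth_gbs(256, 1800))  # → 57.6 GB/s
```

Even with half the bus width, GDDR5's higher effective rate leaves the RV740 only about 11 percent short of the RV770LE's peak bandwidth, which is exactly the offset the article describes.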

Every benchmark result published in the review places the RV740-based card squarely between ATI's existing Radeon HD4830 and HD4850 graphics cards. For example, at 2560x1600 with 8xMSAA and 16xAF in the popular first-person shooter Left 4 Dead, the RV740 turns in 28 FPS, flanked by 25 FPS from the HD4830 and 30 FPS from the HD4850. The card does exhibit a significant performance drop at higher resolutions, likely the result of the 128-bit memory bus and the relatively modest 512MB of memory. Other reported benchmarks include Far Cry 2, Crysis Warhead, Call of Duty: World at War, Brothers in Arms: Hell's Highway, and 3DMark Vantage.

MSRP for RV740-based graphics cards is expected to be $99 USD, which should make them a very attractive offering in the upper-mainstream segment. While the official name of the video card has not been released, the author of the article suspects it will be called the Radeon HD4750, a logical name for a card with these characteristics.


RE: EDID corruption
By Targon on 2/25/2009 4:58:11 PM , Rating: 2
It may be the DVI-to-HDMI adapter supplied with the video card. I have heard (but not verified) that the one provided tends to be problematic, while others that can be ordered online work a LOT better.

RE: EDID corruption
By omnicronx on 2/25/2009 5:10:43 PM , Rating: 2
This is also an issue, but in my case I don't use DVI at all. I have a physical HDMI port.

RE: EDID corruption
By SunAngel on 2/25/2009 7:49:43 PM , Rating: 2
I had a similar problem with the HD2400XT (w/o a fan). My Sony XBR could not be read by the card. But when I switched to a HD2600XT (with a fan), the EDID info was read correctly.

Well, let me backtrack for a moment. The 2400 would read as high as 1280x720 when connected, but my TV is 1080p. Going back and looking at the spec for the 2400, it never explicitly said 1080p. So that's when I decided to go up a notch to the 2600XT. Lo and behold, it worked, except now I have to deal with fan noise.

