
Side-by-side details of AMD's RV670 and R600 processors.

RV670 will initially launch in two flavors: a high-end NVIDIA GeForce 8800 GTS competitor, and a mid-range-priced alternative that should dominate the sub-$200 market.

With RV670, AMD reduced the size of its R600 core by a significant amount.

  (Source: MadBox)
Two new RV670-based cards will make their way into AMD's lineup this week

An AMD presentation for its upcoming RV670 graphics processor, officially dubbed the Radeon HD 3800, was leaked to the media earlier this week. The slides confirm previous expectations regarding the next-generation video card's technical features, such as DirectX 10.1 support, which DailyTech detailed last month.

The slides, whose authenticity DailyTech has confirmed, tout DirectX 10.1 support. Before the end of the month, AMD will officially launch its ATI Radeon HD 3800 series. The launch will consist of two new video cards, the HD 3870 and HD 3850. The HD 3870 will be AMD's new enthusiast part and is expected to launch at a retail price of $250, while the HD 3850 will sell for considerably less. DirectX 10.1, which is scheduled to ship with Windows Vista Service Pack 1, will be supported by both of these new cards.

The RV670 graphics core will be the industry's first 55nm GPU. It will feature more than 650 million transistors on a 192 sq. mm die. By comparison, the ATI Radeon HD 2900 (R600) is manufactured on an 80nm process node and packs 700 million transistors into 408 sq. mm. AMD claims that the change to a smaller process node results in less power leakage and leads to the HD 3870 having less than half the power draw of the HD 2900 XT.
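
As a rough sanity check (not a figure from AMD's materials), the quoted die sizes are consistent with ideal area scaling from the 80nm node to the 55nm node, since die area shrinks roughly with the square of the feature size. A minimal calculation, assuming perfect scaling:

    # Back-of-the-envelope die-area check, assuming ideal (quadratic) scaling
    # from 80 nm to 55 nm. Real layouts never scale perfectly, so treat this as
    # an approximation rather than an engineering figure.
    r600_area_mm2 = 408.0                     # quoted R600 die size
    ideal_scale = (55.0 / 80.0) ** 2          # area scales with the square of feature size
    predicted_rv670_area = r600_area_mm2 * ideal_scale
    print(f"scaling factor: {ideal_scale:.2f}")                  # ~0.47
    print(f"predicted RV670 area: {predicted_rv670_area:.0f} sq. mm (quoted: 192 sq. mm)")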

AMD will also introduce PowerPlay technology in its new processors. The HD 3800 series will feature an embedded power state controller that monitors the command buffer to assess GPU utilization levels. Through PowerPlay, engine and memory clocks along with voltage levels can all be dynamically adjusted automatically by the system.
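
AMD has not published PowerPlay's actual decision logic; purely as an illustration of the idea described above, a utilization-driven power-state controller might look something like the following sketch. The thresholds, clocks and voltages are invented for the example and are not AMD's values.

    # Illustrative sketch only: a simple utilization-driven power-state selector
    # in the spirit of PowerPlay. All numbers below are made up for the example,
    # except the 775 MHz full-3D engine clock quoted for the HD 3870.
    POWER_STATES = [
        # (engine MHz, memory MHz, core voltage)
        (300,  500, 0.90),   # idle / 2D desktop
        (550,  900, 1.05),   # light 3D
        (775, 1200, 1.20),   # full 3D
    ]

    def pick_state(gpu_utilization: float):
        """Map a 0.0-1.0 utilization estimate (e.g. from the command buffer) to a state."""
        if gpu_utilization < 0.20:
            return POWER_STATES[0]
        if gpu_utilization < 0.70:
            return POWER_STATES[1]
        return POWER_STATES[2]

    print(pick_state(0.05))   # mostly idle -> (300, 500, 0.9)
    print(pick_state(0.95))   # heavy 3D    -> (775, 1200, 1.2)
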
AMD will also introduce ATI OverDrive technology with the HD 3800 line of cards. This will allow users to overclock their HD 3800 video cards through ATI Catalyst software. Users will also be allowed to overclock multi-GPU setups.

ATI CrossfireX, previously dubbed Quad Crossfire, will also finally make its debut. In short, users will be able to connect up to four HD 3800 cards through AMD's Crossfire. The technology will support up to 8 monitors, and will also allow overclocking.

According to AMD guidance, the ATI Radeon HD 3870 will feature 512MB of 1.2 GHz GDDR4 memory on a 256-bit bus. The form factor of the HD 3870 will be dual slot. Despite its process node shrink, the card will still require a 6-pin PCIe connector. AMD measures peak board power at 105 Watts, and operating noise at 34 dBA.
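
If the quoted 1.2 GHz is the physical GDDR4 clock and the memory transfers data twice per clock (an assumption, since the slide could also be quoting an effective rate), peak memory bandwidth for the HD 3870 works out to roughly 76.8 GB/s:

    # Peak memory bandwidth estimate for the HD 3870 figures quoted above.
    # Assumes 1.2 GHz is the physical clock and two transfers per clock (DDR).
    bus_width_bits = 256
    mem_clock_ghz = 1.2
    transfers_per_clock = 2
    bandwidth_gb_s = (bus_width_bits / 8) * mem_clock_ghz * transfers_per_clock
    print(f"peak bandwidth: {bandwidth_gb_s:.1f} GB/s")   # 76.8 GB/s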

The Radeon HD 3850, which will be a scaled-back version of the RV670, will feature 256MB of 900MHz GDDR3 memory. Similar to the HD 3870, the HD 3850 will also feature a 256-bit memory bus. It will also require a 6-pin PCIe power connector; however, this time the card will use a single-slot form factor.

The Radeon HD 3870 and Radeon HD 3850 will both come with 320 stream processors, 16 texture units, and 16 render back-ends. The main difference between the two cards, though, will be their clock speeds.

The Radeon HD 3870 will come clocked at 775 MHz, while the Radeon HD 3850 will be clocked lower, at 670 MHz.
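
Since the two cards differ mainly in clock speed, peak shader throughput scales directly with the engine clock. Assuming each of the 320 stream processors issues one multiply-add (two FLOPs) per clock, which is how peak numbers for this architecture are usually quoted, the gap looks like this:

    # Peak programmable-shader throughput, assuming one MAD (2 FLOPs) per
    # stream processor per clock. Real-world performance will differ.
    def peak_gflops(stream_processors: int, clock_mhz: int) -> float:
        return stream_processors * 2 * clock_mhz / 1000.0

    print(f"HD 3870: {peak_gflops(320, 775):.0f} GFLOPS")   # ~496 GFLOPS
    print(f"HD 3850: {peak_gflops(320, 670):.0f} GFLOPS")   # ~429 GFLOPS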

Diamond Multimedia previously leaked details of a GDDR4 version of the Radeon HD 3870; however, the company immediately retracted the leak from its website.

AMD partners tell DailyTech both cards will appear in the retail channel before the end of the month.

Comments


256 bit bus
By lumbergeek on 11/12/2007 5:30:37 PM , Rating: 2
I hope that overcomes some of ATI's speed issues. I may give this one a serious look!

RE: 256 bit bus
By Parhel on 11/12/2007 5:44:38 PM , Rating: 2
If I'm reading it correctly, the second slide from the bottom looks like there is a very slight performance increase over the 2900xt. It looks to be under 5%. That's just 3dMark06, but as it's AMD's own slide I'm sure they would pick a benchmark that's favorable to the new architecture.

Even if the performance isn't quite there, it's definitely a step in the right direction. More performance, half the power, and half the price is quite a leap. We'll know for sure in a few days.

RE: 256 bit bus
By Goty on 11/12/07, Rating: 0
RE: 256 bit bus
By NullSubroutine on 11/13/2007 12:43:43 AM , Rating: 2
I wouldn't consider their changes minor. While not revolutionary, it is still evolutionary.

RE: 256 bit bus
By Parhel on 11/13/2007 1:43:22 PM , Rating: 1
I suppose I was using "new architecture" very loosely, but surely adding DirectX 10.1 is more than a minor change. I didn't mean to imply that it was built from the ground up. After all, people are referring to Penryn as a new architecture and that's comparable to this release.

RE: 256 bit bus
By scrapsma54 on 11/13/2007 2:08:57 PM , Rating: 2
Sounds to me like ATI actually slapped the Radeon 2900XT together to get things going. The bandwidth was more than the GPU could chew. Now the newer GPUs have something refined. Why else would they cut the memory bus by 256 bits and remove 150 million transistors? For them to release a new series this quickly leads me to believe that the 2900XT was a broken GPU from the start.

RE: 256 bit bus
By Axbattler on 11/12/2007 6:00:02 PM , Rating: 2
I'd like to see them overcome the heat/power consumption/noise issue. I think it's an area they've been behind since the X1800XT. Together with the less than market leading performance of the X2900XT, they actually remind me of the Prescott.

RE: 256 bit bus
By FITCamaro on 11/12/2007 6:24:50 PM , Rating: 2
How would making the memory bus smaller help performance?

RE: 256 bit bus
By teldar on 11/12/2007 9:09:49 PM , Rating: 3
They had so much memory bandwidth that they were not using it all. It's not helping performance; it's helping with cost and power requirements.


RE: 256 bit bus
By FITCamaro on 11/12/2007 11:06:27 PM , Rating: 3
OK, yes, it'll cut cost and power, but that doesn't mean it'll improve speed as you first stated. I think this generation has proven that we still don't need more than a 256-bit bus.

RE: 256 bit bus
By Regs on 11/13/2007 12:37:58 PM , Rating: 2
I gather that the bandwidth is not being used because the video card simply is not equipped yet to pull that much information from memory in chunks that big. Kind of like AMD64 vs. Conroe: AMD has much more memory bandwidth but can't do anything with it. Then AMD decided to actually address that problem with Barcelona by improving and enlarging the branch predictor and stack units.

Looks like ATi is taking baby steps first. I imagine later they'll be able to take advantage of the extra bandwidth when they tweak the GPU.

RE: 256 bit bus
By homerdog on 11/12/2007 9:11:59 PM , Rating: 2
That's what I was wondering. Maybe he's comparing to RV630?

RE: 256 bit bus
By jazkat on 11/12/07, Rating: 0
RE: 256 bit bus
By jazkat on 11/12/2007 10:17:22 PM , Rating: 2
and other tweaks

RE: 256 bit bus
By Proteusza on 11/13/2007 11:42:00 AM , Rating: 2
Unless I'm misinformed, AMD keeps its shader clocks the same as its core clocks? In other words, no change?

RE: 256 bit bus
By jazkat on 11/13/2007 2:19:27 PM , Rating: 2
I'm 99% sure the RV670 shader clock is double that of R600.

Maybe I'm wrong, but I'm sure I read it somewhere.

RE: 256 bit bus
By DingieM on 11/14/2007 6:19:00 AM , Rating: 2
I'm 100% sure shader clock in EVERY R6xx generation is the same as general internal clock.

RE: 256 bit bus
By jazkat on 11/13/2007 2:41:19 PM , Rating: 2
Maybe you are right; maybe it's R700 that will have higher shader clocks. It's what lets R600 and RV670 down: NVIDIA's are clocked at twice the speed.

RE: 256 bit bus
By Daidleus on 11/13/2007 10:21:18 AM , Rating: 2
Why on earth do these cards require the use of a 6-pin PCIe power connector? ATI's new cards don't require that much energy at all; my 8800 GTX requires quite a bit more. I have an 850 watt Toughpower power supply that I purchased in March of '07. Unfortunately, it does not have the 6-pin PCIe power connectors. I was hoping I would be able to purchase an adapter that would allow me to use 2 PCIe 4-pin connectors to create a 150 watt 6-pin PCIe power connector, but I haven't seen anything like that yet. I'm sure there are a lot of other people with the same problem. Not sure why ATI decided on this; it just doesn't make any sense to me, especially considering the position they are currently in. I certainly wouldn't want to have to spend money on a new power supply, especially one comparable to the one I have now, in addition to a new video card. Thank goodness I don't need a new card, not yet anyway.

RE: 256 bit bus
By overzealot on 11/13/2007 11:14:54 AM , Rating: 2
Go to eBay and search for PCIe adaptor. Or 6-pin adaptor. Or PCIe power. Any of those searches will yield adaptors.
Sure, they're 2x Molex to PCIe 6-pin, but with an 850 watt PSU you should be able to spare that.

RE: 256 bit bus
By SavagePotato on 11/16/2007 9:33:40 AM , Rating: 2
There is no need. I haven't seen a card yet that didn't come equipped with one of those double molex adapters.

RE: 256 bit bus
By coldpower27 on 11/14/2007 10:54:35 AM , Rating: 2
You only need two 4-pin Molex connectors to make one 6-pin PCI-E connector, from what I recall, because the 4-pin connectors each give 45W, which together is more than enough for a 6-pin PCI-E connector's 75W. You don't need an 8-pin PCI-E for the Radeon HD 3870, as it only needs 105W total, which can be supplied easily from the PCI-E slot and one 6-pin connector.

It's only the 8800 GTX or HD 2900 XT that are a pain, as they require at least two 6-pin PCI-E connectors.
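
For reference, the power budget described above works out as follows, using the usual PCI-E spec figures (75W from the x16 slot, 75W from one 6-pin connector); the 45W-per-Molex number is the poster's own estimate rather than a spec value:

    # Quick power-budget check for the HD 3870 using nominal PCI-E figures.
    slot_w = 75           # PCI-E x16 slot
    six_pin_w = 75        # one 6-pin PCI-E connector
    board_peak_w = 105    # AMD's quoted peak board power for the HD 3870
    available = slot_w + six_pin_w
    print(f"available: {available} W, needed: {board_peak_w} W, "
          f"headroom: {available - board_peak_w} W")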

Hold on...
By clovell on 11/12/07, Rating: 0
RE: Hold on...
By Lonyo on 11/12/2007 6:02:19 PM , Rating: 3
Might be half the power under typical use.
105W is just the peak, not typical.

RE: Hold on...
By AssassinX on 11/13/07, Rating: -1
RE: Hold on...
By Le Québécois on 11/13/2007 3:01:15 AM , Rating: 2
You seem to be confusing the diagrams on this slide.

The top right graph is for the power usage of the HD 2900 XT (at 1 on the graph) versus the HD 3870 (between 0.5 on the left and 0.325 on the right), which would indicate that the HD 3870 does indeed use less power.

The middle bottom graph is for performance in 3DMark06, where it seems the HD 3870 outperforms the HD 2900 XT by a slight margin.

RE: Hold on...
By AssassinX on 11/13/2007 4:32:24 AM , Rating: 2
Oh I see it. Did not notice the 3DMark06 sign. Forget what I said earlier.

RE: Hold on...
By ajfink on 11/12/2007 6:35:07 PM , Rating: 2
No, seeing as the 2900XT drains over twice that much.

This isn't some monster performance leap, but it's definitely a great step. I can't wait to see both benchmarks and power usage figures, seeing as these things are supposed to have new power-saving technologies onboard.

RE: Hold on...
By Regs on 11/12/2007 8:48:59 PM , Rating: 2
And you don't have to worry about upgrading to a 500-watt PSU... maybe. I think that's AMD's goal. As long as it's silent, it doesn't have to be a performance beast, though it does have to compete with an 8800GT at 200-220 dollars.

RE: Hold on...
By Mitch101 on 11/12/2007 6:57:18 PM , Rating: 2
The plus side is I won't need to buy a bigger power supply.

Some early benchmarks show that while it might not beat the 8800GT, it's at resolutions and framerates where the game is unplayable on either card anyway. I saw one for Crysis where the 3870 was 2fps slower than the 8800GT, but neither card was anywhere close to 30fps anyhow, so who cares?

I would like to move to a PASS-FAIL for video resolutions.
For example, one could argue 120FPS is 20% faster than 100FPS, but both are playable at that resolution, so they both pass. When it comes to one card getting 37FPS and the next one getting 33FPS, depending on the person both may be playable framerates.

I prefer 60FPS, but others might accept lower. So to me, if they both play at 60FPS+ at a resolution my monitor supports, then I'm going to get the quieter, cheaper, cooler one; this is where the 3870 is going to do very well.

I also think my power supply might be able to handle 2 of these cards without needing to spend $65.00 on a new power supply.
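
As a sketch of the PASS-FAIL idea above (the threshold and the per-card numbers are invented purely for illustration):

    # Illustration of the pass/fail approach: instead of ranking cards by raw
    # FPS, mark each as playable or not at a chosen threshold. Data is made up.
    PLAYABLE_FPS = 60   # personal threshold; others might accept 30

    results = {"Card A": 120, "Card B": 100, "Card C": 37, "Card D": 33}

    for card, fps in results.items():
        verdict = "PASS" if fps >= PLAYABLE_FPS else "FAIL"
        print(f"{card}: {fps} fps -> {verdict}")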

RE: Hold on...
By Mitch101 on 11/12/2007 7:33:04 PM , Rating: 3
Here is a link

8800GT is faster than HD3870 by 2 FPS exact. (1280x1024, no AA)

From 3Dmark06 and Crysis, it looks like nVidia 8800GT is better than HD3870. However, there is one thing we need to tell you

We found ATi HD3870 is actually has better picture process while Crysis benching.

Specially the light refraction from water, HD3870 is very smooth but 8800GT is not. It is around frame 500 to 1000.

If you have chance, compare it. You will know what are we talking about.

The price for ATi HD3870 we are not sure yet, but it should be less than 8800GT.

Not only that, CF can do on P35, X38, 965, and 975 those motherboard as long as there are 2 PCI

RE: Hold on...
By Mitch101 on 11/12/2007 7:37:49 PM , Rating: 3
Let's throw in a price drop already too.

AMD has dropped its ATI Radeon HD 3800 series (RV670) price from US$199-259 to US$179-239 in preparation for its launch later this month,

RE: Hold on...
By mediahype on 11/13/07, Rating: 0
RE: Hold on...
By Mitch101 on 11/13/2007 9:39:48 AM , Rating: 2
You might want to take that back; it seems NVIDIA might have tweaked the drivers to get a performance boost. I wouldn't have any objections to this, but when it sacrifices video quality then it's a problem for me, because it's marketing trying to one-up the competition.

I don't know what nVidia do to 169.04 but we can see the MAX FPS drops from 30 to 20FPS and the max frame is same as ATi 3870.

Well. I guess only nVidia can answer it.

RE: Hold on...
By mediahype on 11/14/2007 1:42:19 AM , Rating: 2
Well, if you read the paragraph just above it, I quote here:

"During the game at frame 500 - 1000, the quality of the light reflection becomes normal.
I thought the average of FPS will lower maybe .5 to 1FPS."


So changing the filename made the max FPS go lower but the average FPS go higher. AND the light reflection becomes normal compared to ATI's, so all claims of cheating can be put to rest.

I am disappointed by the number of people who rated my comments down because they do not like what I have to say. But the whole magic about the market is that consumers vote with their dollars. Now, not all consumers are smart enough to know what they are buying. But nevertheless, the whole market momentum cannot be changed by simple marketing tricks.

I am also more disappointed by the money behind the marketing. It is very obvious to people who are in the know that those latest paper launches by ATI are simply used and paid for by the same people shorting NVidia, led by that analist Doug Friedman.

RE: Hold on...
By mediahype on 11/14/2007 1:56:19 AM , Rating: 2
BTW, I just wanted to add this bit of information:

Judging from the numerous grammatical errors in that IAX Tech article, it is obvious that whoever wrote it does not reside in the United States. It is probably someone in Taiwan, perhaps one of ATI's launch partners, who is either paid to write this article by Wall Street analists like Doug Friedman (misspelled for your amusement) or is blatantly disregarding the NDA, since the Radeon HD3800 series cards are still not officially announced.

I am a reasonable and independent person. I would definitely love to see ATI release the HD3800 officially (not ALMOST), and have a reputable source like AnandTech or TechReport review it under absolutely no biases. Judging from the benchmarks I have seen, I already know the results: 10% slower than the 8800GT under no AA, and about 15% slower under AA.

RE: Hold on...
By FITCamaro on 11/12/2007 11:15:47 PM , Rating: 3
Question. Why do the colors on the 3870 look so much more vibrant than the 8800GT? The 8800GT image looks all blurred while the 3870 image looks far more crisp.

Is it a difference of the frame where the 8800GT image is farther up in the air and the distortion is from clouds? I wouldn't think 30 frames would have changed the image so much.

The RV670 definitely doesn't seem to take the performance hit the R600 does with AA and AF turned on, though.

And 1280x1024 is nice, but what about higher resolutions like 1680x1050? And at what detail settings were those Crysis benchmarks taken? Must have been high or very high. Also, was this DX9 or DX10? It doesn't say anything about the rest of the system.

RE: Hold on...
By Mitch101 on 11/13/2007 9:45:55 AM , Rating: 2
Different point-in-time screenshot. Look at the shoreline: the ATI picture is taken closer to the land; the NVIDIA one is taken higher, with some cloud cover.

For the 30FPS, see my post above. It seems that NVIDIA might have tweaked the drivers.

Guess we will have to wait till the NDA lifts to get more answers. I too am looking forward to 1680x1050 resolutions.

Also, I'm curious whether two 3850s would perform faster than an 8800GT, as the price of two of them might be close to that of a single 8800GT. Can't wait till Monday.

Would also like to see some older games run on these as well, like Doom 3, Half-Life 1, the original Battlefield, etc.

RE: Hold on...
By s3th2000 on 11/12/2007 7:40:58 PM , Rating: 1
Personally, I would prefer to see some minimum frame rates taken into account. While it's fine to have 60fps playing a shooter, you don't want it to drop to 10fps when the shit hits the fan, ya know?
I would love to have 100FPS so that when a large gunfight happens with several people, it doesn't drop below 30fps, because this is when the fps is most important. I believe HardOCP does this using a graph.

RE: Hold on...
By SlyNine on 11/12/2007 9:17:44 PM , Rating: 2
I agree, minimum FPS is the most important thing. What does it matter if the average is 60fps when, just as everything is on the line, you are getting 10 FPS?

RE: Hold on...
By Parhel on 11/12/2007 10:15:58 PM , Rating: 1
HardOCP is for those who don't understand what benchmarks are for and why they exist. It's useless unless you own the exact same system that they benchmarked with. You're supposed to use benchmarks to extrapolate how a system you built would perform in a given scenario. I would flat-out stop reading AnandTech's reviews if they ever did that.

RE: Hold on...
By akugami on 11/12/2007 11:09:16 PM , Rating: 1
Sorry, and maybe I'm not understanding you, but aren't all video card reviews exactly as you described? Basically, they all choose a particular base system for their benchmarks, and from that you extrapolate performance to your own system unless you own the exact same system. HardOCP does this and so does AnandTech.

I actually prefer HardOCP's benchmarks because they actually try to find out the minimum FPS in a game as well as the peak FPS and average FPS. Let's say card 1 and card 2 both average 35 FPS. Both are about equal, right? What if from a different test you see that card 1 has a minimum FPS of 25 while card 2 has a minimum FPS of 20? Which sounds better now? Sure, card 1 may not peak as high as card 2, but you are assured of smoother performance because it doesn't dip as low as card 2.

RE: Hold on...
By SlyNine on 11/13/2007 12:29:12 AM , Rating: 1
What does anything you said have to do with taking minimum FPS into account? You can figure that you will not see a huge change in FPS if you are running a decent CPU and jumping up to the best one, if you are gaming at any real resolution.

If their high-end computer with an 8800GT was getting a minimum of 60FPS, you can at least be certain that you will get a 45-50 FPS minimum even if your CPU is only 70% as fast/powerful.

I really wish they would show (and they used to for a short period) minimum FPS.

RE: Hold on...
By overzealot on 11/13/2007 11:22:39 AM , Rating: 2
Minimum FPS isn't a very good metric either, as it doesn't state how long the low framerate continued, how many times it occurred, etc.
FPS vs Time graphs are the best for that info.
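
One way to get at what both posters are asking for, starting from a raw frame-time log, is a summary like the sketch below (the sample data is invented):

    # Sketch: summarize a frame-time log (milliseconds per frame) with average
    # FPS, minimum FPS, and time spent below a playability threshold.
    frame_times_ms = [16, 17, 16, 40, 42, 41, 16, 15, 16, 100, 16, 17]  # made-up data

    fps_per_frame = [1000.0 / t for t in frame_times_ms]
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
    min_fps = min(fps_per_frame)
    time_below_30 = sum(t for t in frame_times_ms if 1000.0 / t < 30) / 1000.0

    print(f"average FPS: {avg_fps:.1f}")
    print(f"minimum FPS: {min_fps:.1f}")
    print(f"time below 30 FPS: {time_below_30:.2f} s")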

RE: Hold on...
By clovell on 11/13/2007 1:56:25 PM , Rating: 2
I was just curious - I didn't think a Graphics Card could draw 210 Watts. I wasn't bashing on the card at all. Seems some folks are feeling rather touchy today.

By munim on 11/12/2007 5:43:54 PM , Rating: 3
Any news on the AGP versions of these cards? My ASRock mobo and E6300 are itching for some next-gen card goodness.

By guste on 11/12/2007 5:56:55 PM , Rating: 2
From what I've read over at Beyond3d, the AGP version will be out in December. I've also heard that some partners will be launching at the $199 price point, for the 3870.

I guess we'll know soon enough, seeing as it's launching on the 19th.

By tayhimself on 11/12/07, Rating: -1
By phusg on 11/13/2007 10:48:03 AM , Rating: 2
I should point out that recent AMD/ATI AGP drivers are currently far from stable (echoing the post from BSMonitor further in this thread). Hopefully that will change soon. See for the latest...

By EricMartello on 11/13/07, Rating: -1
By phusg on 11/13/2007 6:58:43 AM , Rating: 3
If you're going to bash AGP for no reason, then at least be funny whilst doing it. An E6300 has more than enough horsepower to drive these cards through the AGP bus, which itself has more than enough bandwidth.
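
For reference, the nominal slot bandwidths behind that claim (peak figures, not sustained throughput):

    # Nominal peak slot bandwidths (GB/s). PCIe figures are per direction.
    agp_8x = 2.1
    pcie_1_x16 = 4.0
    pcie_2_x16 = 8.0
    print(f"AGP 8x: {agp_8x} GB/s, PCIe 1.x x16: {pcie_1_x16} GB/s, PCIe 2.0 x16: {pcie_2_x16} GB/s")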

By murphyslabrat on 11/13/2007 1:11:10 PM , Rating: 2
His point is that the industry has largely moved on, and that AGP products are now a novelty item. In reality, unless you have an insanely awesome motherboard, you can get a replacement mobo for only a little more than the price premium that gets tacked onto AGP cards.

By crimson117 on 11/13/2007 3:20:58 PM , Rating: 2
In reality, unless you have an insanely awesome motherboard, you can get a replacement mobo for only a little more than the price premium that gets tacked onto AGP cards.

More has changed than the GPU slot for many motherboards.

It actually has me quite disillusioned over the whole "upgrade path" bs.

I bought a socket 939 motherboard with DDR ram and an AGP slot (to match my agp 6800GT from summer 2004), along with a brand spankin' new x2 4400+ in mid-2005, right when the 4400 came out.

Now it's fall 2007. My video card is 3.5 years old, my system otherwise is 2.5 years old. I have to replace all of it if I want anything new. AGP is gone, DDR is gone, socket 939 is gone.

If I want to upgrade, it's not just a matter of getting a PCIe motherboard. I have to buy a new processor and new RAM as well. An AM2 4400+ and 2GB of DDR2 RAM would run about $150, and would be about equal in performance to what I have today (and well below a modern Core 2 Duo setup). Then I'd need another $100 or so for an AM2 motherboard. I'd have spent $250 just to get new sockets, and would then still have to buy the new video card!

By murphyslabrat on 11/13/2007 4:20:43 PM , Rating: 2
Yeah, I see your point. However, you can either buy a GeForce 7xxx or Radeon X1xxx series card, or simply build a new computer and keep the old one. Some quick research turned up the following:

$85 for a 4400 X2.
Or, you can get an Opty for the faster cache (the larger cache helps too, but is largely inconsequential); it'll be $80 for a Santa Ana 1210 w/ 1MBx2 L2 cache.

$60 for 2x1GB DDR2 with 86% positive feedback.
Or $55 for a different pair; this one is a discounted $95 set.

Motherboard for $65, 690G, a Sapphire product. Some good OC reports for a uATX board. It doesn't have SLI, which you could get for an additional $10-$15.

$215 for a new mobo/CPU/RAM combo. Add the price of a case (with PSU included...) for $45 or $60,
and you have yourself a brand new computer for $260-$280 + s&h and a new video card, so the new total is ~$500. I didn't include a monitor with that, as you probably could get a used monitor for less than $50 to use with your old computer.

BTW, anyone know what happened to all of the low-end Brisbane chips? Did all of the 65nm fabs get moved to Phenom chips?

By VirtualLarry on 11/14/2007 5:15:12 AM , Rating: 2
By murphyslabrat on 11/14/2007 1:33:35 PM , Rating: 2
But the case I listed (they are the same basic design) has enough bays to get by, as I only know one person with more than 2 optical drives and three hard drives in their PC. That person runs a server in his bedroom closet and has one of those four-foot-tall super-server cases. The advantage of the case I listed is that it is actually about an inch shorter than most mini-tower cases, while still supporting full ATX boards. And this is in addition to supporting up to three 120mm fans, albeit in a weird configuration.

By phusg on 11/14/2007 4:04:09 PM , Rating: 2
It actually has me quite disillusioned over the whole "upgrade path" bs.

I hear you! I bought a dual socket A motherboard with the whole upgrade path thing in mind. It's still running strong and software support is coming along nicely, but unfortunately the AMD chipset couldn't cope with 'newer' AGP graphics cards, newer being within 2 years. I guess us humble consumers should be grateful our hardware manufacturer overlords are even bringing out AGP cards %-(

Hopefully all those addicted to upgrading their whole system are re-using their old PCs in useful ways or disposing of them as chemical waste, which is what almost all consumer electronics amounts to.

By EricMartello on 11/15/2007 9:20:42 AM , Rating: 2
I don't think there is anything funnier than someone still using AGP and publicly admitting to it. :D

By phusg on 11/16/2007 2:30:02 AM , Rating: 2

Ok now that made me LOL.

The most important thing
By ss284 on 11/12/2007 5:32:20 PM , Rating: 3
This card (the 3870) looks to be of similar performance to the 2900 XT, which is often beaten by the 8800GT. They had better price it at less than $239, especially once 8800GTs start dropping to their $200-250 MSRP.

RE: The most important thing
By teldar on 11/12/2007 10:56:33 PM , Rating: 5
I can say that I'm really looking at the 3850 just because of power usage and UVD inclusion. I'm not a huge gamer and don't have time because of class, but the new AMD/ATI cards look like they should be the end-all, be-all of HTPC cards. If you're not looking for top-top-top-end video performance, and are looking for multimedia performance instead, these cards look pretty attractive....


RE: The most important thing
By phusg on 11/13/2007 6:05:08 AM , Rating: 3
Yes, I'm thinking along similar lines, although if you want rock-bottom power usage with UVD then the 2600 Pro would seem the more 'HTPC end-all' card, with the more power-efficient memory modules taking it down to around 30 Watts.

Power usage link:

I'm guessing the 3850 will be up around the 70-Watt mark at full 3D load, although with the new Dynamic Power Management (DPM) these chips supposedly have, it may actually use less whilst watching an HD movie (in some sort of UVD mode?).

DPM link:

RE: The most important thing
By therealnickdanger on 11/13/2007 8:06:48 AM , Rating: 2
AFAIK, AMD's UVD also incorporates VC-1 off-loading, unlike NVIDIA, making AMD the better HTPC card. I'm pretty sure I'll be picking up a 3850 and then overclocking it.

RE: The most important thing
By DingieM on 11/13/2007 8:35:40 AM , Rating: 2
Does anyone have inside information on what the overclocking potential might be?

I seriously want to see some benchmark figures showing whether the HD3870 can be overclocked to a 900MHz core. The R6xx generation's performance ought to scale excellently with core overclocking...

RE: The most important thing
By phusg on 11/13/2007 10:24:18 AM , Rating: 2
Does anyone know if UVD acceleration only works in combination with PowerDVD and physical disks?

666 million transistors
By RamarC on 11/12/2007 5:26:10 PM , Rating: 5
a "sign of the beast" graphics card. i wonder if it'll be able to handle hellgate:london, cb's jericho, or other "devilish" titles.

RE: 666 million transistors
By Zirconium on 11/12/2007 6:48:39 PM , Rating: 5
I noticed that too. Maybe AMD is trying to invoke the dark arts to help make their cards perform faster. Play Diablo at 666666666666666666666 FPS!!!!!

Price
By Etern205 on 11/12/2007 8:36:51 PM , Rating: 2
I'm glad AMD decided to put an enthusiast card at an affordable price point ($250), because back in the old-school days when the 9700 Pro reigned over the graphics card market, that was the normal price of a high-end card ($300 for the 9700 Pro, $200 for the 9500 Pro). Not like today's ridiculous prices of $400-700.

RE: Price
By teldar on 11/12/2007 9:13:23 PM , Rating: 2
The enthusiast card will be the 3890 (?), and it will have two cores on it and, I think, the full 512-bit ring bus. It's going to cost $400.


RE: Price
By murphyslabrat on 11/13/2007 1:36:00 PM , Rating: 2
While it will probably not happen this way, there is the possibility of a total of 8 GPUs: 4x Radeon HD 38xx X2.

Though, the only reason to do that, that I can see, would be FLOP farming: nearly 4 teraFLOPS at ~500 watts. Get 250 units and you have 1 petaFLOP, all for 125 kilowatts!
At an estimated system cost of $2,100 per unit, with an additional $1,600 every 8 units ($200 per) for routing/terminal/rack-mount hardware, you come out with $525,000 for the individual units and an additional $50,000 in hardware. This may be a conservative estimate, as I have no clue on pricing for enterprise-level networking hardware.

That comes to a total of $575,000 for a 1 PFLOP supercomputing cluster! The only downside is that this cost is only achieved using an enthusiast-consumer mobo, as opposed to a server mobo. But 3 PFLOPS for $1,725,000 is a helluvalot cheaper than the ?2 Billion? for the latest Blue Gene.
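
Re-running the poster's numbers as stated (note that 250 units at roughly 500 watts each is about 125 kilowatts of power draw):

    # Re-running the cluster estimate above with the numbers as stated.
    units = 250
    gflops_per_unit = 4000          # "nearly 4 teraFLOPS" per 4x dual-GPU box
    watts_per_unit = 500
    cost_per_unit = 2100
    network_cost_per_8 = 1600

    total_pflops = units * gflops_per_unit / 1e6
    total_kw = units * watts_per_unit / 1000
    unit_cost = units * cost_per_unit
    network_cost = units / 8 * network_cost_per_8

    print(f"{total_pflops:.1f} PFLOPS at {total_kw:.0f} kW")                 # 1.0 PFLOPS, 125 kW
    print(f"units: ${unit_cost:,.0f}, networking: ${network_cost:,.0f}")     # $525,000 / $50,000
    print(f"total: ${unit_cost + network_cost:,.0f}")                        # $575,000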

RE: Price
By Savvin on 11/13/2007 3:21:17 AM , Rating: 2
Yeah, I'm glad to see both NVIDIA and AMD focusing on the mid-level market for a change. With the release of these cards and the 8800GT, prices should get pushed even lower so that those of us that can't afford to pay $350+ for a graphics card can still get a great card to keep pace with the latest games. Kudos!

RE: Price
By Martimus on 11/13/2007 12:55:00 PM , Rating: 2
According to the last chart, the "enthusiast" card will be a 3870 X2, which did not have a price listed. It may be a single card with two RV670 chips on it a la the Voodoo 5500, or it may just be a designation for CrossFire. We'll probably find out more by the end of the week.

Worth the 3XXX name?
By 3kliksphilip on 11/12/2007 5:55:38 PM , Rating: 2
It's all very good that there's a wider bus, lower power requirements and less heat... but are these cards really that much more powerful than the X2900 series? From what I've seen they can put up a fight against the GeForce 8800s, but so they should, having been released a year later. I've always considered the high-end cards from each new series to be significantly faster than the last (I would give examples, but this is the only real exception to the rule that I can think of at the moment). Why not rename them the X2950s and release the X3800s and 900s when they actually have something significantly faster? Are they going to bother with the X3400s and X3600s, and if so, will there be any performance difference between them and the X2xxx equivalents? I could see a place for them in the mobile computing section if they're cool and efficient enough.

RE: Worth the 3XXX name?
By snarfbot on 11/12/2007 8:05:43 PM , Rating: 2
The bus is not wider; it's only 256 bits, while the 2900 XT and Pro were 512.

It also shows in the graph that the 3870 is about 5% faster in 3DMark06, while the 3850 is slower. This does not necessarily reflect real-world performance, but it seems about right, as the 3870 would beat the 2900 XT at all but the highest resolutions. The 3850 has a lower core clock and lower memory bandwidth; that's what's really going to hurt it at higher resolutions.

Are they going to bother with X3400s and X3600s?

Probably, as they are being manufactured on a smaller process, and that would yield more chips per wafer and make them more money.

RE: Worth the 3XXX name?
By NullSubroutine on 11/13/2007 12:47:50 AM , Rating: 2
They changed the entire naming scheme. At first I thought it was stupid they were going to 3000, but after seeing how they did it, I am glad they did. The scheme actually makes sense, unlike a lot of them that are out there.

No more XT GTS Ti GT XTX XTXXXTX GTX or whatever.

RE: Worth the 3XXX name?
By 3kliksphilip on 11/13/2007 12:55:22 PM , Rating: 1
I see what you mean, though I still think a 2950 brand is less misleading than claiming that it's an entire generation ahead of current products. (Though it doesn't stop them from releasing shoddy 2400's and 2600's which are less powerful than the previous generation's high end cards)

whohoo
By ikkeman on 11/12/2007 7:09:58 PM , Rating: 2
Finally, a top-end (top enough, anyway) card that'll fit in my HTPC - top titles on my 42" 1080p TV!!

A power draw of 100 watts is less than the current 2600XT/8600GTS!

Definitely part of the glorious future!

RE: whohoo
By toyota on 11/13/2007 1:18:07 AM , Rating: 2
No, it's NOT. The 8600GTS is rated for 70 watts and uses much less than that in the real world; about 45-50 watts during full load is the norm. The 2600XT uses even less power than the 8600GTS.

RE: whohoo
By phusg on 11/13/2007 7:00:45 AM , Rating: 2
According to XBitlabs they are actually both in the 45-50 Watt range, see

3870 X2?
By ponytrack on 11/13/2007 12:01:32 AM , Rating: 2
On one of the slides it says the 3870 X2; is that a dual-GPU card or something of the sort (a la the NVIDIA 7950GX2)?

RE: 3870 X2?
By Spivonious on 11/13/2007 8:40:39 AM , Rating: 1
I took that to mean two 3870s in Crossfire.

RE: 3870 X2?
By DingieM on 11/14/2007 6:23:22 AM , Rating: 2
No, 2 RV670 on 1 card.

Mobility
By atimob3800 on 11/12/2007 6:37:04 PM , Rating: 2
Any word on when we can expect the Mobility versions of these, or even the price of the 3850?

I'd like to see a laptop with a Mobility 3850 and an entry-level mobile Penryn :D

RE: Mobility
By therealnickdanger on 11/13/2007 8:10:33 AM , Rating: 2

PCIE 2.0
By sitong666 on 11/13/2007 4:52:12 AM , Rating: 2
Is either of them PCIe 2.0? If so, do they need the 6-pin connector when they are in a PCIe 2.0 compatible motherboard?

RE: PCIE 2.0
By overzealot on 11/13/2007 11:06:33 AM , Rating: 2
1> Look at the slides. They're both PCIe 2.0.
2> My guess is "yes" but I'd be happy to be wrong.

Looks good on paper
By DeepBlue1975 on 11/13/2007 8:19:14 AM , Rating: 1
The problem is that this one will be pitted against the 8800gt, which is also selling for $250. If it can't score near the 8800gt, I doubt it will receive a good welcome from the gamers out there.

We'll just have to see how it fares once it's out, and what it delivers. If it consumes much less than the 8800GT and performance is not too far behind, I could be giving this one a serious look (very casual gamer here, mostly car racing simulators).

RE: Looks good on paper
By DeepBlue1975 on 11/20/2007 4:05:28 PM , Rating: 2
Why did I get downrated?

Moot point anyway; what I've seen in reviews comparing both of them side by side drove me to ditch my aging X800 XL in favour of a brand new, factory-overclocked XFX 8800GT.

PCI Express x8 ?
By flukey on 11/13/2007 1:01:12 PM , Rating: 2

If you look at the "Current Bus Setting" item in the image it reads: "PCI Express 2.0 x8"

Doesn't that appear to be running at half the max speed (PCI Express 2.0 16x)?

Anyone have any comments?

RE: PCI Express x8 ?
By flukey on 11/13/2007 1:49:25 PM , Rating: 1
I e-mailed IAX-Tech; it's because the board is being used in a CrossFire configuration. Therefore, with an X38 motherboard the benchmark results of the HD 3870 in CF mode should be higher...

Vista SP2?
By kostas213 on 11/12/2007 5:18:21 PM , Rating: 2
"DirectX 10.1, which is scheduled to ship with Windows Vista Service Pack 2, [...]"

You mean Service Pack 1, I suppose.

512 MB
By thebrown13 on 11/12/07, Rating: 0
RE: 512 MB
By AnnihilatorX on 11/12/2007 8:07:36 PM , Rating: 2
These cards are meant to be mid-range, not top-end.
There will be no high-end cards from ATI for a while.

By AndreasM on 11/13/2007 2:54:40 AM , Rating: 2
The first thing that pops to mind is Valve's quietly forgotten gaming protocol. :P

By livingsacrifice on 11/13/2007 7:57:13 AM , Rating: 2
The major problem I see right now with my 2900XT is the drivers: so many games have come out since the last release, and yet there are still no new drivers. I have heard numerous accounts of problems in the Crysis demo or Gears of War, and yet ATI has still not released new drivers. Yes, power was a major issue for me when I installed my card; it takes so much to power it. But the big thing for me is these new drivers. I still prefer ATI over NVIDIA in terms of graphics quality, even if the 8800GT looks better on FPS scores.

and operating noise at 34 dBA.
By PAPutzback on 11/13/2007 8:52:58 AM , Rating: 2
That is pretty freakin loud for an HTPC.

Hopefully some aftermarket vendors can make a fanless version for the HTPC crowd.

By Martimus on 11/13/2007 9:34:17 AM , Rating: 2
Looking at the last slide, and remembering some earlier rumors, it looks like AMD is doing what 3dfx did before they were bought out and is putting two RV670 chips on a single board for their top-of-the-line card. Of course, that "X2" may just mean that they are forcing a CrossFire setup to be their "enthusiast" solution.

By jipmac on 11/13/2007 2:00:31 PM , Rating: 2
Look what I found: Preliminary Test Radeon HD 3870 /w QX9650 !!!
Check out page 3: a 3DMark score of 12961 with an HD3870 overclocked @ 860/1351. WOW!!


By JonnyDough on 11/13/2007 4:21:02 PM , Rating: 2
When do we get to start seeing green and silver colored cards?

Anti Aliasing and Filtering
By wingless on 11/13/2007 8:23:17 PM , Rating: 2
All I want to know is if ATI did some tweaks to enhance the AA and AF capabilities of this architecture. If not, then it's still a great value over the 2900XT and worth it just for the power savings, but the R600's performance drawback will still remain.

Crossfire OC'ing, eh?
By CrazyBernie on 11/12/2007 11:02:34 PM , Rating: 1
Does that mean companies will be selling matched pairs and quads?? ^_^

By johnsonx on 11/12/07, Rating: -1
RE: opposites
By tigen on 11/13/2007 1:22:57 AM , Rating: 2
Well it is new, in the sense that it isn't the same as the old. They changed the overall naming system also, so it would be more confusing to use a 2XXX name.

RE: opposites
By mediahype on 11/13/07, Rating: -1
RE: opposites
By bangmal on 11/13/2007 7:38:54 AM , Rating: 1
That is the most ignorant comment I have ever read.
Why can't a die-shrink be new? There are actually more transistors, much less heat, and lower power consumption. Intel does the same thing with the Q9650 too. Read and learn.

I would call your wife or your mom NEW after having breast surgery to go from small to extra large. That is a huge difference!

RE: opposites
By mediahype on 11/13/07, Rating: -1
RE: opposites
By Spivonious on 11/13/2007 8:39:15 AM , Rating: 2
It does have new features:

1. UVD
2. ATI PowerPlay
3. DX10.1 support

In my mind this is equivalent to Intel adding SSE4 instructions, and bumping up the cache.

RE: opposites
By mediahype on 11/13/07, Rating: -1
RE: opposites
By DingieM on 11/13/2007 8:44:16 AM , Rating: 2
RV670 IS partly new, with optimized hardware (it seems to also have much less overhead with AA), and it has hardware-implemented DX10.1, which is more valuable than a card that doesn't have it: the 8800GT lacks DX10.1 completely.
Next to that, they are on the 55nm node and have all-digital capacitors for on-the-fly overclocking.
AND RV670 has a full hardware VC-1 decoder, which is used by most new HD films, and the 8800GT does NOT have it. NVIDIA is hiding this from the press.

Read, man.

The 8800GT isn't new, only an optimized G80 die-shrunk to 65nm with OLD ANALOGUE capacitors. They cannot do 55nm yet.

This time, RV670 has much better heat dissipation than the 8800 thanks to 55nm.

You just don't dare to admit it; afraid ATI is getting back into the game or something?

RE: opposites
By mmntech on 11/13/2007 9:05:44 AM , Rating: 2
As a long-time NVIDIA fan, I have to say these new offerings from ATI are pretty tempting. Looks like they're finally pumping out some sub-$200 cards for the mid-range market that are looking half decent. The cards NVIDIA has in that price range are not much more than G70 cards with DX10 support. Nothing like the updates we saw from past generations.

Of course, I'll make my final decision when they come out and we have some hardware reviews. I'm mostly interested in DX9 performance since I don't run Vista.

RE: opposites
By bfellow on 11/13/2007 9:24:54 AM , Rating: 2
By the time DX10.1 is being used, the HD3800 series will have been made obsolete by a newer ATI/NVIDIA generation. Why are you talking about heat dissipation when you're overclocking?
That's like getting a Diet Coke with your hamburger and fries.

RE: opposites
By mediahype on 11/13/07, Rating: -1
RE: opposites
By BSMonitor on 11/13/07, Rating: -1
RE: opposites
By AlphaVirus on 11/13/2007 11:31:48 AM , Rating: 2
AMD cares more about showing up Intel than making good video cards. AMD couldn't manage a chipset/motherboard business, what makes anyone think they can manage a video card business.

If I remember correctly, isn't AMD beating Intel in the chipset market? Especially when it comes to the gaming market.

Remember when we had to buy AMD for 64 bit support? So who's using their Athlon64 2600+ for Windows Vista 64-bit?

Uhm, who is still using an Athlon64 2600+? I thought anyone who still has a single core at least has a 3200+, and the majority of people have already bought a dual or quad core.

RE: opposites
By maverick85wd on 11/13/2007 12:01:08 PM , Rating: 2
I thought anyone who still has a single core at least has a 3200+ and majority people have already bought a dual and quad core.

Athlon XP 2600+ ... and she still runs great! I don't really game that much because I never have time anymore, but for web browsing, running iTunes, and writing research papers :-/, my Athlon XP is wonderful. I do wish it was dual-core and 64-bit sometimes though... will upgrade after the second generation of high-K parts is available.

RE: opposites
By BSMonitor on 11/13/2007 12:10:57 PM , Rating: 2
If I remember correctly, isnt AMD beating Intel in the chipset market. Especially when it comes to the gaming market.

In what world do you live? AMD doesn't even manufacture its own chipsets or motherboards. Haven't since the days of the original Athlon. More than that, Intel is the leading graphics producer by volume. Why? Their integrated chipsets are sold to millions of business PCs. Hence Intel sells more chipsets than anyone!!! And that's just chipsets with integrated graphics. Besides, this post isn't about Intel. It's about AMD marketing fluff.

Uhm who is still using an Athlon64 2600+? I thought anyone who still has a single core at least has a 3200+ and majority people have already bought a dual and quad core.

My point exactly. Thank you. When Clawhammer came out, AMD marketed the hell out of it being 64-bit! Now no one who bought one then even uses the processor anymore, let alone for 64-bit Windows!!!!!

Now, let's market DirectX 10.1 even though no one will be using these cards when games actually start using DX10.1.

RE: opposites
By phusg on 11/13/2007 12:25:12 PM , Rating: 2
I liked your post about the drivers BSMonitor, but I'm going to have to call BS on this one ;-)

majority people have already bought a dual and quad core
Now no one who bought one then [Clawhammer] even uses the processor anymore

What planet do you guys live on? Even if you are just talking about computer enthusiasts (a minority), there are a lot of us still using Athlons! Being a computer enthusiast is not about dropping cash on the latest and greatest hardware every quarter, although this is obviously what every tech company's marketing department would have us believe.

And if you think I'm only speaking for myself, check out these statistics: of the million people using Valve's online gaming system Steam, 75% have single-core processors.

RE: opposites
By murphyslabrat on 11/13/2007 1:04:50 PM , Rating: 2
Now, let's market DirectX 10.1 even though no one will be using these cards when games actually start using DX10.1.

Did you know that the Radeon 8500 is powerful enough to play Oblivion on low settings? However, the only way to do so is to use a third-party program (Oldblivion) to force shader model 1.1. So what if no one is purchasing a new 3850 to play "Super-Mondo-Resource-Hog 2" years from now; those who purchase a 3850 today will still be able to play it, albeit on low settings.

FYI, I know two people who are running Windows Vista Home Basic x64, and another that is running Windows XP x64, all on an Athlon 64 3200, 3400, and 3000, respectively. I, however, am caught wishing I had paid extra for the Athlon 64 3000, instead of settling for an Athlon XP 2800.

Then, of course, one of them was running 64-bit Ubuntu from the time he got his 3200, and rendering in Blender at almost twice the speed. Windows is not a necessity here, especially for early adopters. For him, the new dx10.1 class hardware will be fully usable as soon as OpenGL catches up. And, for rendering, framerate is not as important as flexibility.

RE: opposites
By mars777 on 11/13/2007 7:52:17 PM , Rating: 2
For him, the new dx10.1 class hardware will be fully usable as soon as OpenGL catches up.

OpenGL doesn't have to catch up.
It's an open standard where every developer can write their own OpenGL library that uses whichever hardware features they like in whatever GPU they wish.

New versions of OpenGL incorporate changes contributed by people, and those become the next version of OpenGL.

id Software and Epic are some of those guys...

RE: opposites
By InsidiousAngel on 11/13/2007 11:04:32 AM , Rating: 3
Why don't you go bang your mom? That was the least professional response I could expect from a low-life like yourself.

You are attacking someone's professionalism with a 5th-grader response?

Another ATi paper-launch ?
By kilkennycat on 11/12/07, Rating: -1
RE: Another ATi paper-launch ?
By teldar on 11/12/2007 10:51:19 PM , Rating: 2
I was just looking on Tom's new site and they said NVIDIA has shipped 40,000 8800 GTs worldwide. AMD has shipped 250,000 3850s and 3870s already and will ship another 150,000 before the end of the year. It kind of sounds like there will be supply problems for NVIDIA for a while where the 8800 GTs are concerned.


RE: Another ATi paper-launch ?
By kilkennycat on 11/12/2007 11:35:21 PM , Rating: 1
Saw that already. I suspect Tom's sources are as reliable as they always are. theInquirer is frequently quoted :-) :-)

If ATi can pull off a genuine hard launch, it will be a welcome change from their paper launches of the past 3 years. However, if they have been forced to drop the price to compete with NVIDIA, how are they going to pay off their huge development and startup costs?? It seems as if the ATi board partners might effectively be shipping out a gold bar with each unit, courtesy of ATi charging them less than the true economic price for each GPU. So, 250,000 gold bars -- nice deal for AMD/ATi... NOT!

nV made ~$250 million PROFIT in the last quarter. AMD/ATi had a ~$400 million loss. If there is a price war, guess who can easily survive? And Intel is there to crimp AMD's recovery of development costs on their CPUs. They fully mean to continue squeezing AMD. The AMD Phenom launch can expect a very quick reply indeed from Intel. No problem for Intel to pull in desktop Penryn ship dates and fully match prices, should they choose to give AMD a very unhappy set of Christmas presents.

More sidestepping...
By shreddR on 11/12/07, Rating: -1
RE: More sidestepping...
By jazkat on 11/12/2007 10:46:55 PM , Rating: 1
LOL, these are mid-range cards; we get high-end in 2008.

I'm skipping this lot and waiting for the G100 and R700; I'll never buy the first designs when they are just finding their feet.

The next gen should be much more pleasing (well, hopefully).

RE: More sidestepping...
By Makaveli on 11/14/2007 12:13:42 PM , Rating: 2
RE: More sidestepping...
By DeepBlue1975 on 11/16/2007 12:27:30 PM , Rating: 2
Nice link, thanks.

No advantage in benchmarks over the 8800gt, and the advantage in consumption and heat produced is only noticeable while idling.

Anyway, a moot point, as yesterday I got myself an NVIDIA 8800GT... So there goes my aging ATI X800 XL. It served me well, but it wouldn't let me play RB Rally Dirt and that kinda pissed me off.

After almost 5 years of exclusively AMD + ATi setups (AMD + Matrox before that), now I'm running on an Intel + Nvidia machine... In fact, this is my first Intel CPU after the 80486 dx4 I had some time ago.

I hope I can go back to AMD/ATI some time in the future, but for now I find Intel's and NVIDIA's products to be a better bang for the buck than AMD's/ATI's, so my money goes there (I don't get married to any brand or manufacturer, even though I used to hate Intel a lot and looked for alternatives first :D ).

"Mac OS X is like living in a farmhouse in the country with no locks, and Windows is living in a house with bars on the windows in the bad part of town." -- Charlie Miller
Related Articles

Copyright 2015 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki