
AMD packs next-generation AVIVO high-definition video decoding features into its value and mainstream lineup

AMD’s next-generation value and mainstream products are set to bring DirectX 10 and high-definition video playback to the masses. Although AMD is late to the DirectX 10 game, the upcoming RV610 and RV630 feature second-generation unified shaders with Shader Model 4.0 support. AMD has remained quiet about the number of unified shaders and the shader clock speeds of its next-generation value and mainstream products, though.

AMD is prepared to take on NVIDIA’s PureVideo HD with its next-generation AVIVO video processing. With the RV610 and RV630, AVIVO receives its first upgrade since its introduction alongside the Radeon X1K series. This time around, AMD is integrating its Universal Video Decoder, or UVD, for hardware decoding of the H.264 and VC-1 high-definition video formats.

AMD’s UVD expands on the previous generation’s AVIVO implementation to include hardware bit stream processing and entropy decode functions. Hardware acceleration of frequency transform, pixel prediction and deblocking functions remain supported, as with the first generation AVIVO processing. AMD’s Advanced Video Processor, or AVP, has also made the cut for low power video processing.

Integrated HDMI with support for HDCP joins the next-generation AVIVO video processing for protected high-definition video playback. Unlike current HDMI implementations on PCIe graphics cards, RV610 and RV630 integrate audio functionality into the GPU. Instead of passing a PCM or Dolby Digital signal from onboard audio or a sound card, RV610 and RV630-based graphics cards can directly output audio – removing the need for a separate sound card.

RV610 and RV630 support PCIe 2.0 for increased bandwidth. Native support for CrossFire remains, as with the current ATI Radeon X1650 XT and X1950 Pro products. AMD will also debut RV610 and RV630 on a 65nm manufacturing process for lower power consumption. Expect RV610 products to consume around 25 to 35 watts; RV630 requires more power, at around 75 to 128 watts.
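
As a rough illustration of that bandwidth increase, the sketch below uses the published per-lane signaling rates (2.5 GT/s for PCIe 1.x, 5 GT/s for PCIe 2.0) and 8b/10b encoding; the results are theoretical peaks per direction for a x16 slot, not measurements of these boards.

```python
# Back-of-the-envelope PCIe x16 bandwidth, per direction.
# Both PCIe 1.x and 2.0 use 8b/10b encoding, so 8 of every 10 signaled bits carry data.
def pcie_x16_gb_per_s(gt_per_lane, lanes=16):
    usable_bits = gt_per_lane * 1e9 * 8 / 10   # strip 8b/10b overhead
    return usable_bits / 8 * lanes / 1e9       # bits -> bytes, per lane -> x16, -> GB/s

print(pcie_x16_gb_per_s(2.5))  # PCIe 1.x x16: ~4.0 GB/s per direction
print(pcie_x16_gb_per_s(5.0))  # PCIe 2.0 x16: ~8.0 GB/s per direction
```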

AMD currently has four RV610 reference designs based on two RV610 variants – Antelope FH, Antelope LP, Falcon FH and Falcon LP reference boards, and RV610LE and RV610PRO GPUs. Antelope FH and Antelope LP are similar; however, Antelope LP is the low-profile variant. Both reference boards feature 128MB or 256MB of DDR2 video memory clocked at 400 MHz. Antelope boards employ the RV610LE, feature passive cooling and consume less than 25 watts of power.

AMD’s Falcon LP reference board is another low-profile model with 256MB of GDDR3 memory clocked at 700 MHz. Falcon LP takes advantage of a DMS-59 connector for dual video outputs while maintaining a low profile form factor. The Falcon LP reference board employs active cooling to cool the RV610LE or RV610PRO GPU.

AMD’s Antelope FH, Antelope LP and Falcon LP reference boards only support software CrossFire – all lack support for the CrossFire bridge connector. HKEPC confirmed this CrossFire configuration in a report last week.

The Falcon FH reference board is the performance variant, designed for the RV610PRO ASIC with 256MB of GDDR3 video memory. AMD estimates board power consumption at approximately 35 watts, though it is unknown if Falcon FH boards will feature active or passive cooling. Falcon FH is the only RV610 reference board to support AMD’s CrossFire bridge connector for hardware CrossFire support. Falcon FH also features VIVO capabilities.

RV630 has three reference board configurations – Kohinoor, Orloff and Sefadu. Kohinoor is the high-performance RV630 variant and features 256MB or 512MB of GDDR4 memory. It also features VIVO and dual dual-link DVI outputs. However, it consumes the most power of the three RV630 reference boards, requiring 121 watts for 256MB models and 128 watts for 512MB models.

Orloff falls in the middle with 256MB of GDDR3 video memory. Orloff lacks the video input features of Kohinoor but supports HDMI output. AMD estimates Orloff to consume less than 93 watts of power. Kohinoor and Orloff support PCIe 2.0 and native CrossFire; both require additional power via a PCIe power connector, though.

Sefadu falls at the bottom of the RV630 lineup and features 256MB or 512MB of DDR2 video memory. As with Orloff, HDMI output remains supported. Power consumption is estimated at less than 75 watts, so Sefadu does not require additional power from a PCIe power connector. All RV630 boards feature 128-bit memory interfaces and occupy a single slot.


Comments

Darn
By tkSteveFOX on 3/13/2007 3:10:04 AM , Rating: 2
I was hoping for a 256-bit interface in the mainstream GPUs. This 128-bit bus is a pain in the ***. So a mainstream card will have a 128-bit bus and a high-end one a 512-bit one. That's hardly fair at all. The performance difference will be huge if that's the case.




RE: Darn
By otispunkmeyer on 3/13/2007 3:57:25 AM , Rating: 4
I'm with you here.

NVIDIA has pushed the boat out to 384-bit on the high end, and AMD is likely to push that even further to 512-bit with its high end, yet the mid-range cards get no such increase. A 128-bit bus is just soooo old school now it's a joke... we should have moved on, even if it's just to a 192-bit bus.

I think the reasoning behind it, though, is that it's cheaper to purchase super-fast 1 GHz GDDR3 (2 GHz effective) and 1 GHz+ GDDR4 than it is to build a wider bus equipped with run-of-the-mill 700-800 MHz GDDR3.

The 8600 GTS will have about the same bandwidth as the 6800 GT did when it came out, and the same as the 7800 GT, so mid-range bandwidth is about that of the last two generations' high end.
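
That claim checks out on paper. The quick sketch below uses the 2 GHz effective GDDR3 figure mentioned above for the 8600 GTS, plus commonly quoted 1 GHz effective memory clocks for the 6800 GT and 7800 GT (those two clocks are assumptions, not figures from this thread):

```python
# Theoretical memory bandwidth = (bus width in bytes) x (effective transfer rate)
def mem_bw_gb_per_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(mem_bw_gb_per_s(256, 1000))  # 6800 GT,  256-bit @ 1.0 GHz effective: ~32 GB/s
print(mem_bw_gb_per_s(256, 1000))  # 7800 GT,  256-bit @ 1.0 GHz effective: ~32 GB/s
print(mem_bw_gb_per_s(128, 2000))  # 8600 GTS, 128-bit @ 2.0 GHz effective: ~32 GB/s
```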


RE: Darn
By otispunkmeyer on 3/13/07, Rating: 0
RE: Darn
By sbanjac on 3/13/07, Rating: -1
RE: Darn
By FITCamaro on 3/13/2007 6:20:19 AM , Rating: 2
You can get a 7600GT these days for $100. A 7900GT or X1950Pro for $200.


RE: Darn
By crimson117 on 3/13/2007 9:39:39 AM , Rating: 2
Actually you can get an x1950pro 256MB for as low as $150 today after rebates. Best bang for the buck on the market right now, imho.


RE: Darn
By TechLuster on 3/13/2007 2:54:28 PM , Rating: 2
I agree--that's a great deal. But I think you can do even better. I just picked up an EVGA 7900GS KO (500/1380) for $140 with rebates. Sure, at stock speed the x1950 pro is faster, but with the factory OC (and with Anandtech's review model reaching an ADDITIONAL 20% overclock), I think this may barely edge out the ATI card.

But in any case, the real question is how these two cards are going to compare to G84 and RV630. At $150, the later cards are only expected to be sporting 1.4GHz GDDR3 (same as the X1950 Pro and 7900GS) but with only a 128-bit connection. This is the reason I felt comfortable upgrading now, as opposed to waiting. (I'll be sticking with XP for a while, so DX10 doesn't matter to me.)


RE: Darn
By shabby on 3/13/07, Rating: 0
RE: Darn
By Russell on 3/13/2007 1:48:35 PM , Rating: 2
It's a card designed for HTPCs and such, not for gaming systems. 64-bit is fine for that application.


RE: Darn
By Flunk on 3/15/2007 7:10:47 AM , Rating: 2
Would you use a low-end card for gaming now? (X1300, cough). Then why would you think a new low-end card would be decent for gaming?


RE: Darn
By carage on 3/20/2007 10:12:04 AM , Rating: 2
Unfortunately I do...
I was duped into buying a slim Dell desktop.
When I first received it I was amazed by its size.
Now I only agonize for not choosing its larger cousin.
Looks like I am going to be stuck with the low-profile 1300 Pro for some time.
I have already used it to play Supreme Commander, NBA Live 07, and the C&C3 Demo.
NBA Live 07 actually looks decent. C&C3 doesn't look bad.
Though for Supreme Commander, I have to set it to low details.
Last week I spent a considerable amount of time browsing Shanghai's computer malls, looking for another low-profile video card to replace it.
Unfortunately, I could not find a single card higher than the one I am currently using. I know there should be low-profile 7600GS cards available; I even showed the website to the store clerks, but no luck. Probably just another paper-launch product.


RE: Darn
By R3MF on 3/13/07, Rating: 0
RE: Darn
By saratoga on 3/13/2007 12:01:29 PM , Rating: 2
Pins are really, really expensive. Adding more pins means you need a bigger die (more die area). If they can get away with a 128-bit bus, they're going to do it. I don't know what it's like for AMD's or NVIDIA's parts, but in general there's going to be a hard limit on how small you can make a die and still have enough room for 256 data pins. Which means that certain segments of the market are always going to be 128-bit.


RE: Darn
By TechLuster on 3/13/2007 3:21:20 PM , Rating: 2
I understand what you're saying, but consider the following:

Assuming G84 has 64 shaders as expected, and assuming it had 2/3 the ROPs of the GTX (as opposed to 1/3 if it uses a 128-bit memory interface as expected – more on this later), then it should end up around 300-400 million transistors. On an 80nm process, this will result in roughly the same die size as the 256-bit G71 (7900). Furthermore, the X1950 Pro has 330 million transistors on an 80nm process with a 256-bit interface.

So if they can sell us 7900s and X1950s with 256-bit connections for around $150 now (my $140 EVGA 7900GS KO arrived yesterday), why can't they sell us a ~$225 G84/RV630 with a 256-bit memory interface? I think this is exactly what enthusiasts on a budget have been waiting for.

In the case of G84, a 128-bit interface with the GeForce 8 architecture implies these cards will only have 1/3 the ROPs of the 8800GTX. Hence, the first wave of midrange cards from NVIDIA will be crippled in two ways (both memory bandwidth and ROP power). I believe G84 is just a stopgap until they can roll out midrange cards on 65nm using GDDR4. This will allow them to increase both core and memory clocks, making up for the lack of ROP power and bus width. They're not giving us more hardware in the meantime so that they can maintain pin-compatibility with the 6600/7600 GTs.

(Of course, we all know what they really should have given us: a 384MB 192-bit midrange card. How perfect would that have been?)


RE: Darn
By scrapsma54 on 3/13/2007 7:02:11 PM , Rating: 2
You're overlooking the fact that GDDR4 makes up for the narrower memory interface. In fact, you're overlooking how much wattage it consumes, how much it costs, and how much performance it packs per watt. As long as it shadows the performance of an 8800GTX within 5 frames at under 150-watt consumption and has a much cheaper price (why shouldn't it, since it uses a 65nm manufacturing process), I've got my money on it. Also, 128-bit may seem low in comparison to the G80's, but realize that 128-bit has been around for six GPU generations and 256-bit was introduced much later, in the GeForce 6000 series.


RE: Darn
By InsaneScientist on 3/13/2007 9:44:05 PM , Rating: 2
If they were going to keep the memory speeds the same as the current gen stuff, I'd be right there complaining with you, however....

Consider for a second: what matters is not truly the bus width, what matters is the total memory bandwidth. While increasing the bus width is certainly one of the fastest ways to increase data throughput, and increase it quite a lot, it's also expensive. Going from 128-bit to 256-bit increases the cost of manufacturing by an incredible amount. (I don't remember the figures, but I think you're talking about a couple more layers on the PCB.) The more economical solution, if possible, is to first ramp up the clock speed on the memory, and only then increase the bus width.

We know that GDDR3 is capable of going considerably faster than it's clocked on current midrange cards (IIRC they've gotten GDDR3 up to 800MHz, which would be a 1600MHz effective data rate - far beyond current midrange cards).
And then once we exhaust the potential GDDR3 has to offer, we have another more economical solution before we go to 256-bit: we simply swap out the GDDR3 chips for GDDR4, which we've already seen break 1GHz (2GHz effective data rate), and GDDR4 is still growing...

And look at the chart: one of the midrange cards does exactly that: it's equipped with GDDR4.

The more something costs them to make, the more it will cost us as consumers. It's better for us if they can increase the bandwidth without increasing the bus width, because otherwise it would cost us a lot more.
As long as they can increase the bandwidth, it doesn't matter how they do it.
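
To put illustrative numbers on that reasoning (the clocks below are hypothetical, chosen only to echo the 700 MHz GDDR3 and 1 GHz+ GDDR4 figures mentioned earlier in the thread), the effective memory clock a narrow bus needs in order to match a wider one scales directly with the width ratio:

```python
# Effective memory clock (MHz) needed to reach a target bandwidth on a given bus width.
def required_effective_mhz(target_gb_per_s, bus_bits):
    return target_gb_per_s * 1e9 / (bus_bits / 8) / 1e6

# A 256-bit board with 700 MHz GDDR3 (1.4 GHz effective) delivers about 44.8 GB/s...
target = 256 / 8 * 1400 * 1e6 / 1e9
print(target)                               # ~44.8 GB/s

# ...so a 128-bit board needs memory running twice as fast to match it:
print(required_effective_mhz(target, 128))  # ~2800 MHz effective, i.e. 1.4 GHz GDDR4
```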


RE: Darn
By bargetee5 on 3/15/2007 10:13:19 PM , Rating: 2
GDDR4 halves the interface width needed to achieve the same performance a 512-bit interface needs, since GDDR4 requires less wattage and carries double the bits per transmission. That equals an effective 92.34GB/s, much higher than the data rate of the GeForce 8800GTX, which ironically uses a 384-bit interface.


RE: Darn
By InsaneScientist on 3/15/2007 11:03:39 PM , Rating: 2
What are you talking about?

The only way you can halve the width of the bus and keep the performance the same is if the memory on the narrower bus is running at double the clock speed of the other.
While GDDR4 does allow for higher speeds, there is nothing inherent about the technology that allows it to go faster.

It's like the transition from DDR to DDR2 on the desktop. Assuming that they are both running in dual channel, DDR running at 400MHz will have the exact same bandwidth (6.4GB/s) as DDR2 running at 400MHz.
Now, the latencies on the DDR2 will be higher, but that's a different category.
Granted DDR2 can hit 800MHz, and therefore achieve that 6.4GB/s with half the bus width, but the tradeoff is that the speed must be doubled to do that.
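
Those desktop-memory figures work out as stated; here is a minimal sketch, assuming the standard 64-bit (8-byte) DDR/DDR2 channel width:

```python
# Peak bandwidth of desktop DDR/DDR2: channels x 8 bytes per transfer x transfers per second.
def dram_bw_gb_per_s(channels, mt_per_s):
    return channels * 8 * mt_per_s * 1e6 / 1e9

print(dram_bw_gb_per_s(2, 400))  # dual-channel DDR-400:    6.4 GB/s
print(dram_bw_gb_per_s(2, 400))  # dual-channel DDR2-400:   6.4 GB/s (same rate, same bandwidth)
print(dram_bw_gb_per_s(1, 800))  # single-channel DDR2-800: 6.4 GB/s (half the width, double the rate)
```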


RE: Darn
By Zoomer on 3/17/2007 9:16:20 AM , Rating: 2
He's probably thinking GDDR4 = QDR. Sorry, I would like that too, but this isn't it.


Wait... PCIe 2.0?!
By Snowy on 3/13/07, Rating: 0
RE: Wait... PCIe 2.0?!
By FrankM on 3/13/2007 10:35:01 AM , Rating: 2
1.) No, not yet. Bearlake is expected to be the first with PCIe 2.0
2.) Yes, you can. PCIe 2.0 cards are backwards compatible with normal PCIe slots; you just won't get the additional features (higher bandwidth and power).


RE: Wait... PCIe 2.0?!
By raven3x7 on 3/13/07, Rating: 0
RE: Wait... PCIe 2.0?!
By sdsdv10 on 3/13/2007 1:43:15 PM , Rating: 2
quote:
That is good to know, thanks! I was afraid we were going to have another AGP-to-PCIe debacle.


Please explain how you feel the AGP to PCIe change was a debacle. I would like to know your thoughts on a better way to replace one hardware standard with another. This process is always difficult and I don't see any simple way to make it easier. At some point, you just have to make the switch. Same thing with DDR to DDR2. In most cases the benefits of the new technology outweigh the cost and difficulties associated with the changeover.


RE: Wait... PCIe 2.0?!
By Polynikes on 3/13/07, Rating: 0
RE: Wait... PCIe 2.0?!
By othercents on 3/13/2007 3:12:34 PM , Rating: 2
I purchased my first AGP system in '98 and upgraded the video card and/or motherboard almost every year. This has given me almost 8 years' worth of use along with about 6 different video cards. PCI Express was initially created in 2004, but we didn't see mass production of the standard until 2005. I didn't purchase a PCI Express machine until mid-2006.

Based on the video tests that I have seen, video cards haven't seen much of a performance boost with PCI Express compared to their AGP counterparts. However, the bus speed of PCI Express far exceeds AGP and will be more apparent with newer video cards.

If you compare AGP to your standard PCI bus you will notice a major difference in speed, especially with the last 8x versions of the bus. This is one of the reasons why I could never say that AGP was a debacle. It had a very good purpose when it was created, and then another technology became the better choice. As technology moves forward you will find other technologies that do the same (e.g. DVI vs. HDMI).

Other
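
For context, a quick sketch of the commonly quoted theoretical peaks for those buses (textbook figures rather than measurements; the PCIe entry is per direction):

```python
# Commonly quoted peak theoretical bandwidth per bus, in GB/s.
buses = {
    "PCI (32-bit / 33 MHz)":        0.133,
    "AGP 4x":                       1.07,
    "AGP 8x":                       2.13,
    "PCIe 1.x x16 (per direction)": 4.0,
}
for name, gb_per_s in buses.items():
    print(f"{name:30s} ~{gb_per_s} GB/s")
```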


RE: Wait... PCIe 2.0?!
By D4rr3n on 3/13/2007 4:40:24 PM , Rating: 2
quote:
Based on the video tests that I have seen, video cards haven't seen much of a performance boost with PCI Express compared to their AGP counterparts. However, the bus speed of PCI Express far exceeds AGP and will be more apparent with newer video cards.

If you compare AGP to your standard PCI bus you will notice a major difference in speed, especially with the last 8x versions of the bus.


Yes, exactly right. The only performance advantage PCIe brings is the ability to connect multiple video cards together. PCIe at any speed rating will give you ZERO performance increase over the AGP 8x version of the same card. Not only that, 4x and 8x AGP do not perform any differently either (so two cards with the exact same specs, one on PCIe x16 and the other on 4x AGP, should perform equally well). This is simply due to how the system works and the transfer speed of the AGP or PCIe connection not being a bottleneck. This will continue to be true as long as the memory on the graphics card and your system memory is adequate. The transfer speed only comes into play when all the information loaded into your video card's memory (mostly large textures and other stuff at the load of a level) has completely filled it and it needs to request a large amount of information, or a bunch of small transfers back and forth repeatedly, MID GAME. If that doesn't happen you will never, EVER notice a difference. With the memory on some of today's cards and future cards that should never be experienced.


RE: Wait... PCIe 2.0?!
By D4rr3n on 3/13/2007 4:11:02 PM , Rating: 2
Well, this is my take on it (and I'm not giving you a hard time or ragging on you or anything like that): if you want to stay on the cutting edge you are going to be replacing lots of equipment quite often. There is simply nothing you can do... other than accept your decision to either stay on the cutting edge or live with what you have. I'm not sure if you're saying you want it to last 1.5 or 3 years? If you expect 3, expect to be disappointed as you watch technology pass you by. You may get 1.5 and you may not; it really depends on your timing and luck.

Just some stuff off the top of my head as examples of recent changes: DDR to DDR2, with DDR3 coming soon, was very rapid; AMD changed sockets fast enough to make your head spin, I believe at least 6 sockets in the past 3.5 years with another 2 by the middle of next year; the move from ATA to SATA and now SATA 3Gb/s (with Intel going SATA-only on some boards), with eSATA and SAS trying to creep in as well; the rapidly increasing power requirements and connector types that may have required a new PSU; of course the AGP-to-PCIe switch; and many more things I'm sure I'm forgetting. Unfortunately it is just the nature of the beast.

I certainly understand where you are coming from though. My biggest complaint is the cost of AGP cards, ESPECIALLY at retail. I understand part of why PCIe was made the standard so quickly was that it also cut down on manufacturing costs, but when has that EVER affected the pricing of premium hardware? Generally better performance = higher price, worse performance = lower price. I have 2 older systems networked here and I don't use them much, but I was looking to get a new AGP card for one of them and the pricing is just assbackwards.

But anyways, if you managed to get 3 years out of your system you should be more than satisfied; IMHO that is a long time. I probably average a new build once a year, though it'll vary on which new tech is needed or available. I've gone as low as 6 months between full builds, and if I was lucky I got 1.5 years out of a system. It's not as expensive as you think; I can easily turn around a PC to a coworker or friend who isn't technically inclined and will pay a premium. And of course there is always eBay, where people will buy anything. Just sell the equipment you plan to replace before any serious depreciation occurs and you shouldn't have many complaints.


RE: Wait... PCIe 2.0?!
By bob4432 on 3/14/2007 11:09:14 PM , Rating: 2
I wouldn't say eSATA and SAS are trying to creep in; they are for different uses. eSATA – obviously, external SATA – is good for external HDDs but not much better than 1394a/b or USB 2.0 unless you use the hell out of the drive, in which case eSATA = SATA.

SAS = servers. Using 2.5" 10K SAS HDDs gives datacenters a very high amount of storage in GB per area used (not sure how they measure it – in rack units, or whatever), so that is where it is coming in. But at least they are making SAS controllers able to control SATA drives :)


RE: Wait... PCIe 2.0?!
By Anonymous Freak on 3/13/2007 10:01:40 PM , Rating: 2
No, it's much more like AGP 2x to AGP 4x.


Pretty cool...
By Cybercat on 3/13/2007 6:02:35 AM , Rating: 2
So when you buy one of these cards, you're basically buying a sound card with it? Factor in Havok FX accelerated physics, and you've got the quintessential all-in-one "gaming board" people have speculated about.

Also, to the above post, it's not all about fillrates and 3DMark scores. For the most part, mainstream cards have nearly doubled in performance generation to generation. Part of that is raw bandwidth and fillrate; the other part is efficiency, since each new architecture improves utilization and performance in many situations. And it's always been a better deal to buy last-gen high-end cards for the same price as current mainstream cards. If you want more performance, buy the former; if you want more features, buy the latter. But it's important to have mainstream DX10 cards on the market.




RE: Pretty cool...
By clemedia on 3/13/2007 8:28:59 AM , Rating: 2
I suspect the card will have a couple of pins on it to transfer sound to it, much like you do with the front-of-the-case headphone jack. It will just import the sound and pass it down the HDMI cable; poor wording in the article, methinks.


RE: Pretty cool...
By Spivonious on 3/13/2007 10:31:45 AM , Rating: 2
quote:
RV610 and RV630 integrate audio functionality into the GPU. Instead of passing a PCM or Dolby Digital signal from onboard audio or a sound card, RV610 and RV630-based graphics cards can directly output audio – removing the need for a separate sound card.


RE: Pretty cool...
By clemedia on 3/13/07, Rating: 0
RE: Pretty cool...
By darkpaw on 3/14/2007 9:22:38 AM , Rating: 2
Why would that be so hard to believe? Not like it costs much for an audio decoder. They don't need any hardware to process/output the audio since that's all done by the device receiving the HDMI signal.


RE: Pretty cool...
By Scrogneugneu on 3/13/07, Rating: 0
RE: Pretty cool...
By Flunk on 3/15/2007 7:34:19 AM , Rating: 2
Watch out for the sarcasm police.


AMD/Nvidia
By skarbd on 3/13/2007 6:05:53 AM , Rating: 2
I don't see AMD having much to worry about; just provide "stable" drivers which actually work well and they will be onto a winner.

I have found that the NVIDIA PureVideo drivers enhance nothing, and worsen the image when I have switched them on with WinDVD 7 & 8, so not a lot for AMD to worry about there.

I don't think it's an unreasonable thing to ask for when people are shelling out a lot of money for these products. It's like buying a Porsche and then forgetting to provide an engine management system.




RE: AMD/Nvidia
By Regs on 3/13/2007 8:31:58 AM , Rating: 1
Drivers? Their first objective is to actually release a new product. I feel like I'm playing my games on my imaginary K8L Stars and RV630.


RE: AMD/Nvidia
By D4rr3n on 3/13/2007 9:54:18 AM , Rating: 2
Well that is pretty much spot on. Now I like AMD as much as anybody else, for years I only built AMD systems, and I had always hoped they'd catch up with Intel. But I just don't understand the total and complete dismissal of any of the problems AMD is currently facing.

Perfect example right here: AMD has nothing to worry about but drivers because someone can't get PureVideo set up properly with WinDVD. I mean, could anybody honestly think that? They certainly have a hell of a lot to worry about, and drivers aren't at the top of that list.

They are in the worst position, both financially and in terms of competitive products on the market, that they have been in for a very long time. I hope they can bounce back from this because it certainly isn't good for the consumer to have less competition in the marketplace, but they are in one hell of a jam right now. People can close their eyes and plug their ears all they want, but it won't make this situation go away.


RE: AMD/Nvidia
By Zoomer on 3/17/2007 9:20:53 AM , Rating: 2
Well, there aren't actually any real DX10 boards on the market. The much-loved G80 doesn't support DX10 in Vista right now.

Drivers are important. If not, why don't you switch to VESA SVGA drivers for your card right now? That has gotta be well tested.


Wait for the benchmarks.
By Mitch101 on 3/13/2007 2:21:08 PM , Rating: 2
I'm certainly going to wait for the benchmarks before passing judgment on what they should have done, and I am happy that there will be something to compete with NVIDIA to hopefully reduce prices on DX10 hardware.

For the DVR- or video-in-capable chipset, I can only hope it's supported by Freevo or MythTV. I don't see myself using Vista Ultimate for PVR functions because of the $400 price tag and might just continue with XP Media Center Edition. Of course, then I would lose the DX10 functionality.

My past experience with ATI PVR functionality has been poor on many levels. I wish Microsoft would make the PVR functionality a separate add-on to Vista for something like $50.00 instead of a $400.00 OS. Oh well, time will tell.




RE: Wait for the benchmarks.
By jkresh on 3/13/2007 3:24:13 PM , Rating: 2
ATI MMC was good in the past but hasn't kept up, and when they went to CyberLink (for some of their tuners) it was a debacle. As for Vista, Home Premium has Media Center; you don't need Ultimate for it (and if you have Business and need the Home Premium features, an upgrade to Ultimate isn't that bad).


Graphic has errors
By Trisped on 3/13/2007 4:12:23 PM , Rating: 2
For the top-line 630 card it says the memory is just GDDR, and for the bottom-line 630 card it says the memory is DDR2. I guess that is an AMD mistake though, as they are the ones that usually make the slides.

I like that they have the power usage. They should post that for the higher-end gamer cards too, not just the desktop/multimedia cards. It would be nice to have the GPU and RAM speeds as well.

All in all the article was very well written and insightful.




RE: Graphic has errors
By Anh Huynh on 3/13/2007 5:20:43 PM , Rating: 2
Fixed, thanks for pointing that out.

It's actually GDDR4.


External?!
By Scabies on 3/14/2007 9:46:02 AM , Rating: 2
What's up? Why did no one else notice that the RV630's Kohinoor and Orloff are listed as "Power Type: External"?
Hellooo, doesn't this solve the power usage debate about the 600 series being a freaking electric pig?




RE: External?!
By Warren21 on 3/14/2007 2:35:08 PM , Rating: 2
I think they mean that it requires a PCIe power connector and internal (in this chart) refers to power over the PCIe bus.


128-bit
By Egglick on 3/13/2007 11:30:41 AM , Rating: 2
The 128-bit memory interface is just plain lousy. Seeing as how at least one of the 8600 cards will be 256-bit, RV630 has no chance to compete unless the memory clock is jaw-dropping.

This is nothing new though.....ATI has been neglecting their midrange since the 9600 series.




Great for Media Center PC
By Micronite on 3/13/2007 11:35:14 AM , Rating: 2
This will be great as a Media Center video card. I try to keep my Media Center and Gaming rig separate. This will enable me to spend less money on a video card and still be able to get some good quality out of it.




Low power
By typo101 on 3/13/2007 5:34:33 PM , Rating: 2
quote:
AMD will also debut RV610 and RV630 on a 65nm manufacturing process for lower power consumption. Expect RV610 products to consume around 25 to 35 watts.


I like this. This sounds like it will run nice and cool. Is the 65nm a result of the AMD buyout, or are they just paying attention to customers' pleas for cooler-running graphics cards?




Windows Experience Index Rating
By flurazepam on 3/14/2007 12:20:08 PM , Rating: 2
According to the charts, Kohinoor and Orloff both have a Windows Experience Index rating of ">6". Interesting, considering my two overclocked 8800GTXs only have an index rating of 5.9! I'm assuming these guesstimations will be part of the new, updated rating system.




"Spreading the rumors, it's very easy because the people who write about Apple want that story, and you can claim its credible because you spoke to someone at Apple." -- Investment guru Jim Cramer













