



HIS X1600 Gemini (image courtesy of HKEPC)
Single-board dual RV570 boards in the pipeline

ATI’s upcoming RV560 and RV570 will find their way into single-board, dual-GPU CrossFire configurations. Because the GPU cores have integrated compositing engines, the cards can support CrossFire directly on the board.

ATI is recommending that its AIB partners join the new 80nm mainstream and value parts on a single PCB with a PLX Technology ExpressLane PEX 8532 PCI Express bridge chip. This bridge chip is already used on ATI Gemini graphics cards such as the GeCube dual X1600. The chip itself is nearly the same size as the GPU but draws only about 7.38 watts. Preliminary boards show the PEX 8532 without a heatsink, which isn't surprising given its low power draw.

On Gemini graphics boards the PEX 8532 bridge chip (PDF) takes one PCI Express x16 interface and divides its bandwidth in half, allocating eight PCI Express lanes to each GPU, similar to how lower-end Intel, SLI, and CrossFire motherboards divide sixteen lanes across two slots. Because the PEX 8532 is a generic PCI Express switch, it can also be used for implementations other than graphics switching.
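
To put that split in concrete terms, here is a back-of-envelope calculation of the theoretical link bandwidth involved (a sketch assuming PCI Express 1.x rates of roughly 250 MB/s per lane, per direction; real-world throughput is lower):

    # Theoretical PCI Express 1.x bandwidth on a bridged dual-GPU board.
    # Assumes ~250 MB/s per lane, per direction (the PCIe 1.x spec rate).
    PCIE1_MB_PER_LANE = 250

    def link_bandwidth_mb(lanes):
        """One-way theoretical bandwidth of a PCIe 1.x link, in MB/s."""
        return lanes * PCIE1_MB_PER_LANE

    host_link = link_bandwidth_mb(16)  # the x16 slot feeding the PEX 8532
    per_gpu = link_bandwidth_mb(8)     # each GPU behind the switch gets x8

    print("Host x16 link:   %d MB/s" % host_link)  # 4000 MB/s
    print("Per-GPU x8 link: %d MB/s" % per_gpu)    # 2000 MB/s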

DailyTech has learned that Hightech Information Systems (HIS) has dual RV570 products in the pipeline using the PEX 8532 bridge chip. Although RV570 won't necessarily offer as much horsepower as a Radeon X1900XT, Gemini variants will be able to dedicate one RV570 GPU to physics processing: one GPU handles graphics while the second handles either physics or additional graphics work. Additionally, up to four DVI outputs can be driven per card with the help of twin DMS59 interfaces.

ATI’s 80nm RV560 and RV570 are expected to arrive in August and September.


Comments



...
By shabby on 6/25/2006 1:48:29 PM , Rating: 2
What's the point of using midrange GPUs? If that were an R580 on that board I'd be impressed.




RE: ...
By modestninja on 6/25/2006 1:51:42 PM , Rating: 2
I would guess it keeps the power draw, and hence heat, down. Also, they cost a whole lot less to manufacture.


RE: ...
By fxyefx on 6/25/2006 2:26:46 PM , Rating: 2
I agree. I don't understand why ATI is releasing dual-chip cards using midrange GPUs when the end performance will be on par with or below that of higher-end single GPUs. Wouldn't that end up costing ATI more, especially with the development costs? Why not just work on tweaking the power efficiency of the higher-end GPUs? (ATI does a great job with this in their mobile solutions.) CrossFire without the dongle is nice, though.


RE: ...
By masher2 (blog) on 6/25/2006 2:48:49 PM , Rating: 2
> "I don't understand why ATI is releasing dual-chip cards using midrange GPUs when the end performance will be on par or less than higher end single GPUs"

A) Same performance from slower clocked chips = higher net yields.

B) Lower power draw

C) Development costs can be amortized against dual-chip solutions utilizing higher-clocked GPUs.


RE: ...
By fxyefx on 6/25/2006 2:59:26 PM , Rating: 2
I understand that they'll get better yields with the lower-clocked chips, but shouldn't it still cost them more, considering that they have to use double the number of GPUs to get performance that could be had from one?

ATI already has a lot of methods by which they could achieve lower power draw; they just haven't implemented them in their desktop solutions, for some reason.

Does ATI even have plans for dual-chip cards using higher-end GPUs? If they do, I just hope they don't end up with nvidia-like solutions (massive increase in power draw for insignificant performance increases.)


RE: ...
By masher2 (blog) on 6/25/2006 3:40:31 PM , Rating: 2
> "shouldn't it still cost them more considering that they have to use double the number of GPUs to get the performance that could be had for one? "

It all depends on their yield curve. There's another factor as well... the sales of the midrange-clocked parts. If they're having trouble selling all those chips, they can either drop the price further, cutting margins, or they can "convert" two midrange chips into a top-end performer through the use of one of these dual-GPU boards.
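
To make the yield point concrete, here's a toy cost model (a sketch; the defect density, wafer cost, and die sizes below are invented for illustration, not actual ATI figures):

    import math

    # Toy die-cost model using a Poisson yield curve: yield = exp(-D * A).
    # All numbers are hypothetical, purely to illustrate the economics.
    DEFECT_DENSITY = 0.5   # defects per cm^2 (assumed)
    WAFER_COST = 3000.0    # dollars per wafer (assumed)
    WAFER_AREA = 706.9     # cm^2, area of a 300 mm wafer (approx.)

    def cost_per_good_die(die_area_cm2):
        """Cost of one working die, ignoring edge loss and test costs."""
        dies_per_wafer = WAFER_AREA / die_area_cm2
        yield_rate = math.exp(-DEFECT_DENSITY * die_area_cm2)
        return WAFER_COST / (dies_per_wafer * yield_rate)

    big_die = cost_per_good_die(3.5)    # one hypothetical high-end GPU
    small_die = cost_per_good_die(2.0)  # one hypothetical midrange GPU

    print("One big die:    $%.2f" % big_die)          # ~$85
    print("Two small dies: $%.2f" % (2 * small_die))  # ~$46

With these made-up numbers, two small dies cost roughly half as much as one large one, because yield falls off exponentially with die area.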


RE: ...
By yyrkoon on 6/25/2006 9:05:34 PM , Rating: 2
Not everyone is a gamer; some people actually work for a living and use their desktop as a workstation. Midrange cards with up to four outputs would come in very handy...


RE: ...
By rushfan2006 on 7/3/2006 9:51:38 AM , Rating: 2
quote:
Not everyone is a gamer; some people actually work for a living and use their desktop as a workstation. Midrange cards with up to four outputs would come in very handy...


Your post is a bit puzzling to me (full disclosure: yes, I'm a gamer). Your statement implies that gamers don't work for a living. It's like you are looking down on gamers ("some people ACTUALLY work for a living") as some kind of lazy social rejects or something.

I assure you I work quite hard, as do all my gamer friends...how do you think we afford these rigs? LOL


RE: ...
By killerroach on 6/25/2006 4:30:00 PM , Rating: 2
The reason for midrange GPUs is simple... have you ever heard the cooling system on the X1900XT? Heck, that thing's practically a Dustbuster Junior (remember the days of the GeForce FX 5800 Ultra?). If you had two X1900XTs on the same PCB, you'd be talking either water cooling or a small jet turbine to keep the silicon from bubbling over (the former being simply extreme and the latter being large and noisy).


RE: ...
By Wwhat on 6/25/2006 7:42:22 PM , Rating: 2
But the X1900 AIW gets by with a much smaller cooler simply by running lower clocks. So two X1900s at a more modest clock could compete with NVIDIA's dual-7900 approach, which likewise relies on lower-clocked chips.
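
The downclocking argument follows from the usual dynamic-power relation, P = C * V^2 * f: a modest drop in clock and voltage cuts power disproportionately. A minimal sketch with illustrative numbers (not measured X1900 figures):

    # Dynamic power scales roughly as P = C * V^2 * f, so scaling
    # voltage and frequency together compounds the savings.
    # The 10%/5% figures below are illustrative assumptions.
    def relative_power(v_scale, f_scale):
        """Power relative to stock for given voltage/clock scaling."""
        return (v_scale ** 2) * f_scale

    # A 10% clock drop paired with a 5% voltage drop:
    print("%.2fx stock power" % relative_power(0.95, 0.90))  # ~0.81x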


RE: ...
By NextGenGamer2005 on 6/25/2006 6:41:08 PM , Rating: 3
RV570 may be mid-range, but it is still a beast. This GPU will have 12 pixel pipelines (along with 12 TMUs and 12 ROPs), a total of 36 pixel shader units, and a 256-bit memory interface. The core clock speed is rumored to be 625MHz, so it should actually offer better performance than the Radeon X1900 GT (which has a 575MHz core clock speed).

The RV560 is exactly the same but with a 128-bit memory interface. The core clock speed will probably be 450MHz to 550MHz.
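
Plugging the rumored clocks into the standard fill-rate formula bears that out (a sketch using only the figures quoted above; actual game performance depends on far more than peak fill rate):

    # Peak pixel fill rate = ROPs * core clock.
    # Specs are the rumored/quoted figures from the comment above.
    def gpixels_per_sec(rops, core_mhz):
        """Theoretical pixel fill rate in gigapixels per second."""
        return rops * core_mhz / 1000.0

    rv570 = gpixels_per_sec(12, 625)    # rumored RV570
    x1900gt = gpixels_per_sec(12, 575)  # Radeon X1900 GT

    print("RV570:    %.1f Gpix/s" % rv570)    # 7.5
    print("X1900 GT: %.1f Gpix/s" % x1900gt)  # 6.9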


Hopefully this isn't another rush job...
By Rock Hydra on 6/25/06, Rating: 0
What's old is new...
By dilz on 6/25/2006 2:21:18 PM , Rating: 2
ATI has experience with dual-GPU implementations, a la http://www.anandtech.com/showdoc.aspx?i=1098&p=2.

The Rage Fury MAXX was a decent card when it came out, but it was plagued with driver problems. The current iteration of SLI/CrossFire technology should be able to resolve the issues that video card makers faced in the past when making dual-GPU solutions.


RE: What's old is new...
By Rock Hydra on 6/25/2006 3:33:19 PM , Rating: 2
I know someone who had one of those, but generally if things are rushed, they're not that good. I would think that one would want to bring something to market that actually competes. But we'll have to see when the benchmarks come out.


RE: What's old is new...
By wakeboardin on 6/26/2006 12:56:34 AM , Rating: 2
It isn't just time spent, it's experience. It takes time and refinement to get good at something, like SLI/CrossFire and NVIDIA's dual-GPU boards.


RE: What's old is new...
By dilz on 6/26/2006 1:22:53 AM , Rating: 2
The design was revolutionary for the time, but I think too much bandwidth was lost to rendering overhead. As I recall, each GPU was responsible for rendering every other frame. The "load balancing" between the GPUs led to lackluster performance compared with what was viewed as the technology's potential.

Current multi-GPU solutions offer a variety of rendering modes to mitigate unfavorable performance in certain situations. Anyway, I'm happy to see ATI develop a single-card solution, because it will inevitably be cheaper to produce and purchase than what the competition is offering at the moment.


That's a big a** board...
By rpsgc on 6/25/2006 2:41:13 PM , Rating: 2
And they said the 7900GX2 was big :D




RE: That's a big a** board...
By slashbinslashbash on 6/25/2006 2:51:14 PM , Rating: 2
Yeah, it's huge! I've got a full-tower case and my AIW X1900 is already nearly touching my hard drives. I don't think this thing would fit unless I removed one of the HD trays.


RE: That's a big a** board...
By PrinceGaz on 6/25/2006 6:18:09 PM , Rating: 2
It's not as long as the Voodoo 5 6000 would have been if it had been launched. Then again, most people thought the V5 6000 was rather big too.


What's with the bridge chip
By wakeboardin on 6/26/2006 1:03:10 AM , Rating: 2
I'd rather have two cards running in x16 mode. I believe the new NVIDIA bridge chip is basically a switch. Do graphics cards take that big of a hit from being in x8 mode anyway? I saw a comparison between an AGP and a PCI-E card and they performed within 1 fps of each other on all games. How much of the x16 are graphics cards using?




RE: What's with the bridge chip
By Master Kenobi (blog) on 6/26/2006 9:08:26 AM , Rating: 2
Not too much, really. Graphics cards don't use much of the x16 bandwidth; there's a lot of unused headroom on the lanes, which is good because it gives cards room to grow before we have to replace the slot with x32 or something, heh...


RE: What's with the bridge chip
By Lonyo on 6/26/2006 9:23:26 AM , Rating: 2
Plus they are mid-range cards, so they would need less bandwidth than a high-end card anyway.


Confused...
By clementlim on 6/25/2006 2:58:51 PM , Rating: 2
1. Single card that features dual midrange GPUs, but that is not capable of competing with the X1900XT???

2. If that is the case, the argument is one GPU for graphics and the other for physics???

3. Does the motherboard need to be CrossFire-ready, since the CrossFire chip/bridge is already integrated into the card???

4. It shares the bandwidth of the PCI Express x16, meaning each GPU gets x8 only??? Or does this apply only to the Gemini board, not the new dual-GPU cards???




RE: Confused...
By KristopherKubicki (blog) on 6/25/2006 4:09:29 PM , Rating: 3
Hi,

quote:
1. Single card that features dual midrange GPUs, but that is not capable of competing with the X1900XT???

That's not really determinable yet, although likely.

quote:
2. If that is the case, the argument is one GPU for graphics and the other for physics???

Yes, that is specifically what the target market for this card seems to be once it starts using the RV560/RV570 ASIC.

quote:
3. Does the motherboard need to be CrossFire-ready, since the CrossFire chip/bridge is already integrated into the card???

Right now, yes.

quote:
4. It shares the bandwidth of the PCI Express x16, meaning each GPU gets x8 only??? Or does this apply only to the Gemini board, not the new dual-GPU cards???

Gemini is just ATI's name for the project of putting two GPUs on a single board. RV560 and RV570 are not dual-core; they are just ASIC replacements for existing ATI chips. But yes, there are only eight lanes per GPU, if that even really matters. Mid-range GPUs, and even high-end GPUs, rarely have cases where they can push the full bandwidth of the x16 bus.
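
A rough utilization estimate shows why (a back-of-envelope sketch; the per-frame upload figure is an assumption, since most textures and geometry stay resident in video memory rather than crossing the bus every frame):

    # How much of a PCIe 1.x x8 link might a game actually use?
    # Both workload numbers below are illustrative assumptions.
    X8_BANDWIDTH_MB = 8 * 250  # 2000 MB/s one-way, PCIe 1.x rate
    MB_PER_FRAME = 5           # assumed dynamic data uploaded per frame
    FPS = 60

    needed = MB_PER_FRAME * FPS  # 300 MB/s
    print("Needed: %d MB/s of %d MB/s (%.0f%% utilization)"
          % (needed, X8_BANDWIDTH_MB, 100.0 * needed / X8_BANDWIDTH_MB))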


Memory
By epsilonparadox on 6/26/2006 12:39:52 PM , Rating: 2
Is the memory on the card shared between the processors, or in separate banks?




RE: Memory
By Myrandex on 6/26/2006 1:07:27 PM , Rating: 2
Separate.


A True Single-Card Solution
By Ulfhednar on 6/25/06, Rating: 0
RE: A True Single-Card Solution
By PT2006 on 6/25/06, Rating: 0
RE: A True Single-Card Solution
By The Cheeba on 6/25/06, Rating: 0
RE: A True Single-Card Solution
By wuZheng on 6/25/2006 2:45:59 PM , Rating: 3
There was a time people used to make logical, well-thought-out arguments before becoming fanboys, too... Before you wrote YOUR comment, maybe you should have considered the fact that it does matter to some?

Yes, it does matter to me that it's two PCBs, on many levels:

1) Most importantly, it takes up case space and therefore blocks the airflow coming from the front fan of my case, which is already admittedly limited.

2) The way NVIDIA implemented this solution, it's like they made sure this card was a thermal nightmare. There is no effective cooling solution available for these cards besides the stock one, and the stock fan doesn't do much, especially the one sandwiched between the two PCBs...

3) A moot point, but it's ugly, 'nuff said.

4) Another moot point: it's a waste of natural resources to use that much extra PCB when you could've done it with one.

So yeah, at least I backed it up nice and clear. I bet you thought you had asked a rhetorical question, too...


R600
By Enron on 6/26/2006 1:55:06 PM , Rating: 2
R600 is what I'm waiting for. Gemini means nothing to me.




RE: R600
By pyrosity on 6/26/2006 5:54:59 PM , Rating: 1
R600... for the DX10 support, I would guess.

For as big a fuss as there was about cards needing SM3.0 support, arguments for DX10 support may carry more weight if the whole "unified shader" concept turns out to be what we have heard it is.

As tasty as these new cards sound, I can't afford anything other than something that will last a long time.


"Google fired a shot heard 'round the world, and now a second American company has answered the call to defend the rights of the Chinese people." -- Rep. Christopher H. Smith (R-N.J.)













