

70 comment(s) - last by Trisped on Jun 8 at 12:52 PM

ATI announces the latest perk of CrossFire

Two's company, three's a crowd -- that is a saying around these parts. ATI is looking to make "three" the magic number when it comes to physics on desktop PCs. ATI today announced at Computex an asymmetric CrossFire configuration that allows gamers to pair two graphics cards in a traditional CrossFire multi-GPU setup with a third graphics card dedicated solely to handling physics. According to our well-placed sources, this would also explain why ATI has been winking and nodding at manufacturers to include three physical 16x slots on their new motherboards.

This opens up a whole new level of possibilities, not only for physics in current and future games, but also for how a gamer chooses to upgrade his or her rig. Take, for example, a gamer who is using a single Radeon X1600 Pro graphics card right now and decides to kick it up a few notches with dual Radeon X1900-class graphics cards in CrossFire mode for maximum performance. Instead of tossing the Radeon X1600 Pro aside to collect dust in a corner somewhere, or selling it for much less than it cost, that gamer can now (if the motherboard supports it) put the "odd man out" to some actual work.

ATI is of course thrilled with the possibilities this opens up for gamers (along with the possibility of gamers going out to buy yet another ATI-based graphics card) and is throwing more resources into its CrossFire certification program. This latest move in CrossFire physics is an intriguing solution, and one that could be quite a bit cheaper than a dedicated physics solution for gamers who are already packing dual ATI graphics cards. Here's a statement from ATI's press release:

"The addition of physics to the CrossFire platform, and the continuing evolution of CrossFire is based directly on the feedback of hardcore gamers - CrossFire is not ATI's platform, it's gamers' platform," said Godfrey Cheng, Director of Marketing, Platform Technologies, ATI Technologies Inc., responsible for ATI's CrossFire strategy. "Asymmetrical physics support, broader certification, and untouchable overclockability are a direct result of gamers' input. CrossFire will continue to evolve to be more open, flexible and easy to use without sacrificing performance, and it starts with boundless gaming."






Two problems I see...
By segagenesis on 6/6/2006 8:18:04 AM , Rating: 3
For starters, why not just copy nVidia and make a stacked double card for the first slot and then give the option for the 2nd slot being useful for other tasks? Having 3 graphics slots on a board rather than 2 or 4 makes about as much sense as demanding Microchannel slots. Why not just have the 2nd card do physics in a 2 slot configuration?

Given the PhysX and its technically important yet rather unimpressive debut, is all this fuss about accelerating physics really that big of a deal? I can only keep thinking back to 1995 and the year of the 3D decelerator, when having hardware 3D was slower than software.




RE: Two problems I see...
By phatboye on 6/6/2006 8:34:02 AM , Rating: 3
The 3 slot configuration is for people who upgrade their computers down the line, like he said in the summary above. Yes, putting two GPUs on a single card would be great, but not everyone could afford such a card. Instead, what ATI is suggesting is that customers could buy one card at a time until they have 3 cards in total: two dedicated to GFX processing and one dedicated to physics processing.

Your solution would be better though if GPUs weren't so expensive.

As far as your second point is concerned, this is AGEIA's first product, so don't expect it to be great. As time goes by and the AGEIA engineers get more time to perfect their product, I'm sure their PPUs will get substantially better.


RE: Two problems I see...
By goku on 6/6/2006 1:36:53 PM , Rating: 2
Actually no, his idea really DOES make sense. If you've got the money for two X1900XTXs etc., then they should just let the consumer stick with dual-slot motherboards: turn that old X1600 into a physics card, and then purchase a dual X1900XTX card instead of buying two cards so that you need three slots and a redesigned motherboard. Sticking with two slots is a better idea; just use the second slot for the physics card and the first slot for a dual video card. If they do get three-slot boards, it would be for quad-card CrossFire plus physics. Either way, I'd still rather have a company like AGEIA become successful with their physics card, as it's not easy to pioneer a technology like this, plus I don't have to buy another motherboard just to get physics, which I would have to if I were to go with the ATI idea. A physics card in a PCI/1x/2x PCIe slot IMO makes more sense; you don't really need all that bandwidth, it's not like they're textures or anything...


RE: Two problems I see...
By randomlinh on 6/6/2006 8:45:58 AM , Rating: 2
hey, w/ 3x16xPCI-e... we can have sextuple cores driving our games!

The power requirements of video cards have become insane very quickly


RE: Two problems I see...
By Master Kenobi (blog) on 6/6/2006 8:57:47 AM , Rating: 2
Yea, can anyone give me a figure of how much ampage I would need on my 12v rails to sustain this kind of monster? I know the current CrossFire needs around 35A or 38A on the 12v rails, now add in a third? 45-50A? Think I'm gonna have to pass, I don't know of ANY single PSU that has that much ampage. Might be time to look into that second mini-PSU for gfx cards...
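The back-of-the-envelope rail math behind guesses like these can be sketched as follows. All wattages below are illustrative assumptions, not measured figures for any real card:

```python
# Rough 12V-rail budget for a triple-card CrossFire-plus-physics rig.
# Every wattage here is an assumption for illustration only.
RAIL_V = 12.0

cards_w = {
    "Radeon X1900 #1": 120,        # assumed peak draw, high-end card
    "Radeon X1900 #2": 120,
    "Radeon X1600 (physics)": 40,  # assumed draw, midrange card
}
rest_of_system_w = 180  # assumed CPU, drives, fans on the 12V rail

total_w = sum(cards_w.values()) + rest_of_system_w
amps = total_w / RAIL_V  # power = volts * amps, so amps = watts / 12
print(f"Estimated 12V load: {total_w} W ~ {amps:.0f} A")
```

Under these made-up numbers the rig lands in the high-30s of amps on the 12V rail, which is at least in the same ballpark as the 45-50A worry above once you pad the card figures upward.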


RE: Two problems I see...
By Hypernova on 6/6/2006 9:47:40 AM , Rating: 2
Can't help but feel sorry for the power supply designers, their work is gonna be a living hell come next year.


RE: Two problems I see...
By peternelson on 6/6/2006 1:19:58 PM , Rating: 2

As a recent anandtech story pointed out, the next generation graphics cards will take even more power.

They said that psu makers were already working on 1000W, 1200W maybe 1400W to supply the needs of such monsters.

Therefore 3 current gen video cards should be powerable off those new psus when they come.


RE: Two problems I see...
By rrsurfer1 on 6/6/2006 10:04:27 AM , Rating: 2
I can't believe what some of the power requirements are. Competition certainly is causing ATI/NVIDIA to increase performance at any cost.

Just FYI - current is the correct word to use rather than "ampage". Current is measured in amps, and ampage sounds like voltage, so this is a common mistake.


RE: Two problems I see...
By schwinn8 on 6/6/2006 3:31:15 PM , Rating: 2
"ampage" is certainly wrong... but it doesn't have to be "current"... amperage is certainly a viable term, and a real word.


RE: Two problems I see...
By Thrawn on 6/6/2006 10:57:52 AM , Rating: 2
PC Power and Cooling makes a 1 Kilowatt PSU that could handle it as it supports 66A but it costs $500 so I don't think it is something that many people will use. I think their 850W unit could also barely support the setup.


RE: Two problems I see...
By ET on 6/6/2006 12:00:04 PM , Rating: 2
Frankly, the 3D decelerator was still the thing that impressed me most, for some reason. I'd seen Quake on SGI machines, and later I loved the Unreal graphics (including when running at 320x200 at 9fps on a Rage II+), but nothing to me was as impressive as seeing Tomb Raider bilinear filtered on a ViRGE.


RE: Two problems I see...
By ViRGE on 6/6/2006 4:47:10 PM , Rating: 2
I like your thinking.


RE: Two problems I see...
By fenderkb76 on 6/6/2006 6:54:09 PM , Rating: 2
The problem with two cores on one card for the purposes of SLI/Crossfire is that it basically borrows PCIe lanes from the 2nd PCIe slot. I'm not sure that you would have full bandwidth on the empty PCIe slot. Some of the lanes would be used, wouldn't they? I believe this will also depend on the chipset involved... dual 8x PCIe vs. dual 16x PCIe.


MM .. Interesting
By tuteja1986 on 6/6/2006 8:09:51 AM , Rating: 3
Now, I hope Nvidia does kind of the same so that developers will support it.




RE: MM .. Interesting
By phatboye on 6/6/2006 8:25:23 AM , Rating: 2
With 2 PCI-e slots used for Crossfire/SLI and one slot used for a PPU, there will not be much room left for expansion on most ATX motherboards. I think there needs to be a new standard that addresses this growing problem.

But yes, I must agree with the previous poster: this application of ATI's GPUs as a PPU is an excellent idea. I hope that NV does something similar with its GPUs so that developers would support such configurations.

With ATI throwing its hat into the PPU segment, AGEIA had better step up its game because ATI will crush AGEIA if it doesn't. It's always good to see competition, but I can't help but worry about the little guy (AGEIA) being able to keep pace with a well known, well established company such as ATI.


RE: MM .. Interesting
By phatboye on 6/6/2006 9:30:20 AM , Rating: 2
Oh yeah, I also forgot about the problem of how much power the PSU will have to supply in order to keep such a beast running. Both ATI and NV are going to have to start making their GPUs run more efficiently with respect to electrical usage.

If Intel, of all companies, can make energy efficient processors than anyone can do it :o


RE: MM .. Interesting
By BigLan on 6/6/2006 2:39:32 PM , Rating: 2
Yeah, 2x dual-slot video cards + Sound Card is already 5 slots filled. Toss in another video card (If the manufacturer gets the slot positions right) and you're pretty much full.


RE: MM .. Interesting
By ET on 6/6/2006 11:54:13 AM , Rating: 2
NVIDIA supported it first. At GDC, HavokFX was shown on NVIDIA hardware. (Maybe even at the NVIDIA booth, I don't really remember.) They said then that it'd run on ATI, too, but now I guess it's officially supported by both ATI and NVIDIA.


RE: MM .. Interesting
By Trisped on 6/6/2006 4:05:28 PM , Rating: 2
HavokFX can run on pretty much any video card that supports shader 2.0

ATI's solution is a bit different, building the ability straight into the card. The result is a possibly faster system, as the cards can be optimized to talk to each other across the PCIe bus. ATI also supported a system where downtime on the video card could be used to do physics. If that's true, I wonder if you could do the same thing here: just use 3 cards, with one dedicated to physics, one to graphics, and one to do both as needed.


RE: MM .. Interesting
By ET on 6/7/2006 6:48:31 AM , Rating: 2
Where have you seen ATI mentioning its own solution? The press release talked purely of HavokFX.


RE: MM .. Interesting
By Trisped on 6/7/2006 12:37:02 PM , Rating: 2
http://www.dailytech.com/article.aspx?newsid=1414
http://www.pcper.com/article.php?aid=226&type=expe...

As an ATI fan I can understand it is hard to keep track of the competition. As noted in the NVIDIA press release, their physics engine requires an SLI setup, with one card devoted to physics and the other to graphics.

If you read the above articles, they talk about ATI's plan to allow single and CrossFire solutions that do both graphics and physics at the same time. ATI explains that the GPU can do this during downtime, and if you saw the chart from the SM 3.0 article, it is easy to see that there is a reasonable amount of free GPU time while rendering a frame.


3 cards?
By poohbear on 6/6/2006 8:59:37 AM , Rating: 2
aren't they being a lil impractical w/ 3 cards? who's gonna buy 3 cards if ppl aren't even convinced to buy 2?!




RE: 3 cards?
By Griswold on 6/6/2006 9:13:15 AM , Rating: 2
Not many. But did you read the article? They market this as an upgrade path. Use your old and rusty vid card instead of throwing it away when you make the move to CF.

But thats about the only use I can see for this.


RE: 3 cards?
By Cincybeck on 6/6/2006 9:23:18 AM , Rating: 1
It's not a viable upgrade path either at the moment when you factor in that no motherboard has three slots. So you're looking at spending another $220 on a new motherboard.


RE: 3 cards?
By Griswold on 6/6/2006 9:57:21 AM , Rating: 2
No not at the moment, but in a couple months. People will buy new mobos (also because of core2 and am2) and if that board has 3 slots - voila, there is your "upgrade" path for the future. Not like anyone needs any physics acceleration right now anyway, with the few games that may or may not support it.


RE: 3 cards?
By phatboye on 6/6/2006 10:17:43 AM , Rating: 2
RE: 3 cards?
By Goty on 6/6/2006 11:11:40 AM , Rating: 2
Actually, that's really EITHER one x16 slot or two x8s; they don't all work at the same time.


RE: 3 cards?
By Trisped on 6/7/2006 12:45:15 PM , Rating: 2
There are many people with more money than brains who like to upgrade their PCs once a year. These people would need to upgrade their motherboards anyway to get the newest processors, so there is the motherboard. Plus, it allows for strange combinations. Currently I have an AIW card, so I can't upgrade without buying a new TV tuner as well as a video card. With this new motherboard I could keep the AIW, use it as another display device, TV tuner, and physics processor, all while being able to upgrade to the latest and greatest CrossFire setup.


Yeah, serious power!
By Arctucas on 6/6/2006 9:50:28 AM , Rating: 2
As far as power requirements, you would probably need a PCP&C 1KW PSU (70A max on 12V rails).




What's wrong with you guys
By drank12quartsstrohsbeer on 6/6/2006 10:02:46 AM , Rating: 3
With all your moaning. Just because a company introduces a product, it doesn't mean you are obligated to BUY it, does it?
The market will decide if these are worthy products.


RE: What's wrong with you guys
By fungry on 6/6/2006 10:17:04 AM , Rating: 2
True, it's all up to the consumer and their decisions. The market will surely tell whether these products are up to their standards. In addition, a 1KW PSU is quite horrendous. We're at the age where probably a 650W is considered WoW/Damn/Holy lolz.


By segagenesis on 6/6/2006 10:34:03 AM , Rating: 2
It's a legitimate gripe when CPU power is increasing and there is focus on lessening the total power dissipation at the same time. For graphics card manufacturers this seems to not be a concern... electricity is not free.


RE: What's wrong with you guys
By Trisped on 6/7/2006 12:51:54 PM , Rating: 2
In a sense, we are trying to turn the market to offer more energy efficient computers rather than the super-uber-mega-power eaters that we are creating now. It has already started in the CPU market, but it seems to be an outgrowth of the laptop market into low end CPUs. The FX and EE versions are still taking more power than their predecessors.


All That for the low low price of......
By SpaceRanger on 6/6/2006 10:12:27 AM , Rating: 2
thousands of dollars!!! Yes.... And the wife is gonna wonder why the lights dim in the house every time you fire up your PC.




RE: All That for the low low price of......
By fungry on 6/6/2006 10:17:56 AM , Rating: 2
haha yea! Dimming lights. Happens when i turn on this power sucking air-conditioning appliance...


RE: All That for the low low price of......
By hwhacker on 6/6/2006 10:33:25 AM , Rating: 3
It should be noted along-side the 2+1 config that has everyone's panties in a bunch, they showed a 1+1 config being possible. This, in my opinion, seems the most feasible.

Instead of ditching your x1600/x1700/x1800/x1900 (although i'd imagine you'd want to sell your x1900) when you grab your fresh new R600, just throw it in the second slot for some physics. It's especially a good option if you grab a card like a x1600/x1700/x1800gto for now to carry you over until then, as they're not expensive but should yield, if ATi is to be believed, better results than an equally-priced PPU in physics, while being dually useful then and now.

I do agree on two points others have made:
1. ATi should concentrate harder on creating a dual-GPU card, like the one Nvidia just released, so we don't need 3 different 16x slots, or so we have more room on PCI-e for expansion of lesser things like a soundcard, etc.
2. I hope Havok's API can mature to be on par with what AGEIA's can SUPPOSEDLY do with interaction as opposed to being passive. As a previous poster said, that does not yet seem to be the case. Since Nvidia and ATi are both using Havok though, I'm sure it will grow... and it may just explode in AGEIA's face... perhaps even forcing them to abandon hardware and go more Havok's route, software, if they want their (perhaps superior) tech to survive. It's already been shown AGEIA's tech can be used without a PPU (supposedly on PS3), so I think it's a definite possibility.

Oofta. I guess we'll see.


By abhaxus on 6/6/2006 3:45:07 PM , Rating: 2
I wonder if it will be possible for people that have the Asrock 939Dual like myself... I've got an ATI X800XL in my AGP slot and am waiting for a PCIe card to get really excited about. Would be awesome if I could keep my X800 in there for physics. Guess it's probably not very likely.


By Trisped on 6/7/2006 12:58:32 PM , Rating: 2
I am praying that ATI does not release a two-cards-connected-to-each-other solution like NVIDIA's 7950 GX2. It is a sign of a lazy developer. Sure, NVIDIA is trying to capitalize on its SLI name, but you don't get 2x performance from running 2 cards in SLI. You would be much better off building the 2 GPUs into 1 card with twice the hardware. You could put twice the RAM on it, since each card would not need its own set; you would have the same number of shaders, since you have them from both; and latency between cards wouldn't be a problem. Overall, you would get something much closer to 2x than any SLI or Crossfire setup will ever be.


How about something for normal gamers
By bigboxes on 6/6/2006 10:39:45 AM , Rating: 4
Most peeps don't use SLI or Crossfire. We might have one good card that adequately plays games. When it starts chugging as we try out the latest games, then we reluctantly upgrade to something affordable. (Read: <$200) I think something more reasonable would be to use the older card as a physics card when we upgrade. The newer card would reside in the first x16 slot and the retired card would reside in the second x16 slot. Two cards, not three. Sheesh, these video card makers and their pipe dreams.




RE: How about something for normal gamers
By hwhacker on 6/6/2006 10:43:47 AM , Rating: 2
If you read around the web this morning, ATi says they will support exactly what you just mentioned. The 3-card option is just getting all the attention...Because it's so damn crazy. :P


RE: How about something for normal gamers
By michal1980 on 6/6/06, Rating: -1
RE: How about something for normal gamers
By Motley on 6/6/2006 1:54:06 PM , Rating: 2
Except that in your case, where you upgrade your 7800gt to a 7900gt, you could now put the 7800gt in the second slot and use it for physics.

When you decide to upgrade your 7900gt to a 7900gtx2, you can throw away the 7800gt, and use the 7900gt for physics. I really don't see a problem with this. Those of us that upgrade regularly can make better use of our last-gen video cards.


By hoppa on 6/6/2006 1:57:11 PM , Rating: 2
quote:
Except that in your case, where you upgrade your 7800gt to a 7900gt, you could now put the 7800gt in the second slot and use it for physics.


Even better, he could sell it ;)


By Trisped on 6/7/2006 1:40:41 PM , Rating: 2
quote:
Except that in your case, where you upgrade your 7800gt to a 7900gt, you could now put the 7800gt in the second slot and use it for physics.
So casual readers don't get confused:
This is just an analogy, NVIDIA does not really support this type of use.
The ATI analogy would be:
You bought the x1800, then upgraded to the x1900 with the x1800 doing physics, then bought the x1950 with the x1900 doing physics and sold the x1800 on ebay.


Impractical
By Cincybeck on 6/6/2006 9:05:18 AM , Rating: 1
I honestly think a dedicated PPU, designed for advanced physics calculations, is a better solution. From what I've read, GPUs won't be able to do the more advanced processing down the road when physics affects the gameplay. Right now they're only being used to process visual effects that don't affect the gameplay in a fundamental way. I'm just worried that if the solutions from ATi and Nvidia take hold, it will hinder the advancement of realism down the road.

Not to mention the absurd amount of power it would take to power another GPU...




RE: Impractical
By bamacre on 6/6/2006 12:34:38 PM , Rating: 2
I agree. And I don't understand why physics can't be done on the processor by means of a second core. Upcoming games, like Crysis, should be multi-threaded. There should be plenty of CPU power to handle the physics as well as everything else.


RE: Impractical
By Scabies on 6/6/2006 1:09:57 PM , Rating: 2
That would make sense, not that we're all into logic these days..
But currently all we have dual cores doing is this: one manages background crap like virus scanning and the kernel, and the other manages active processes. Why not task a core with physics? (It saves a PCIe slot and considerable power, and ups the CPU efficiency [fewer clocks go to waste].)


RE: Impractical
By rrsurfer1 on 6/6/2006 4:14:46 PM , Rating: 2
Physics cores are *supposed* to feature specialized architecture to perform physics calculation much more efficiently (much in the same way that graphics cores are more suitable for graphical calculations).

However, that said, bandwidth and delays propagated through the bus certainly hurt physics performance, as illustrated by the PhysX's "stutter" issue when doing heavy physics.

I think some real promise lies in cache coherency with the CPU, as AMD is pursuing.


RE: Impractical
By Trisped on 6/6/2006 4:25:08 PM , Rating: 1
The problem is that the CPU is designed to do a wide range of tasks, though only 1-2 things at the same time (excluding pipelines). A GPU and a physics coprocessor are designed to do a large number of simple tasks simultaneously. An analogy would be a man named CPU who has a PDA cell phone. He can use the phone to call, keep track of appointments, listen to music, and everything else a mobile warrior would need. Another man, Physics, has a cell phone, mp3 player, and PDA. He can do everything the man with the PDA phone can do, but he can do it all at the same time.

Physics is really a bunch of small, simple calculations. The calculations need to be done quickly, so the dependent equations have the information that they need. Physics is also stupidly threadable (like graphics), which means many cores in one unit will be put to good use. CPUs are not designed for this.

Additional reading would include pipelines, threading, and CPU/GPU structure.
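The "stupidly threadable" point can be sketched with a toy example (not ATI or Havok code; all constants are illustrative). Within one timestep, each body's update depends only on its own state, so the work splits across any number of workers with no locking; a real engine would run this on a GPU or native threads, the Python pool here just illustrates the decomposition:

```python
# Toy illustration: per-body physics updates are independent within a
# timestep, so they parallelize trivially ("embarrassingly parallel").
from concurrent.futures import ThreadPoolExecutor

DT = 0.016       # assumed ~60 fps timestep, in seconds
GRAVITY = -9.81  # m/s^2

def step(body):
    """One Euler integration step for a single body, given as (pos, vel)."""
    pos, vel = body
    vel = vel + GRAVITY * DT
    pos = pos + vel * DT
    return (pos, vel)

# 1000 bodies at height 10 m, initially at rest.
bodies = [(10.0, 0.0)] * 1000

# No body reads another body's state, so the map needs no synchronization.
with ThreadPoolExecutor(max_workers=4) as ex:
    bodies = list(ex.map(step, bodies))
```

Graphics has the same shape (one thread of work per pixel or vertex), which is why a GPU's many simple cores suit both jobs better than a CPU's one or two complex ones.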


RE: Impractical
By Trisped on 6/7/2006 12:48:41 PM , Rating: 2
Upcoming GPUs will be able to take shader results and send them back to the CPU, which will fix the game play dependencies. This may even be a side effect of SM 3.0 where shaders can write the results back to GPU memory.


This is crazy !
By icered on 6/6/2006 10:16:28 AM , Rating: 2
Years ago I had to try and save every penny to get myself a good graphics card to play the new games that came out at acceptable framerates. When they brought out SLI and CrossFire, I thought they were going overboard for an increase in performance. Now another card just for physics? This is NUTS!!

They should be integrating everything into a single card.
An analogy would be like having a separate processor for FPU operations besides the primary CPU(If AMD has its way this too seems like another likely prospect). Seriously buying several components for gaming that exceeds the cost of the rest of your PC by several times over doesn't seem to be a sensible proposition. But then again this is just my point of view.




RE: This is crazy !
By hwhacker on 6/6/2006 10:41:46 AM , Rating: 2
^ If rumors are to be believed, both companies are dabbling in that now (keep old card, change the GPU in said card). It's not improbable to believe that if that does happen, there will be dual-socket cards like this, allowing for two different ATi (or Nvidia) cores to be used... one for rendering, one for physics.

There's also AMD's new Torrenza, which perhaps ATi or nvidia could release a part for. That surely would be more of a niche product though, as it would leave all Intel users out.

I think we're definitely seeing a transformation coming in this and other areas, and by the end of 2008 into 2009 we'll start to see the changes actually taking shape.


RE: This is crazy !
By Trisped on 6/7/2006 1:04:19 PM , Rating: 2
ATI does have a single card solution that does both GPU and physics.

As far as I saw, AMD is pushing the co-processor in the server market where they do a lot of math functions. In reality though, having multi-core math coprocessors is what the physics card is, so the product is already sold.

And flip-chip GPUs don't seem likely due to the limited number of contacts found on a video card board. Only the people that upgrade every year would benefit from it, and then only by a small amount.


Perhaps a change is required
By Sceptor on 6/6/2006 10:26:04 AM , Rating: 2
This is getting absurd...what these companies don't realize is that power costs $$$. Maybe they should design a standardized card layout with interchangeable GPU cores and memory slots to upgrade.

This way you could add a new GPU core (even a physics co-pro) if required, and it would save us from buying 3+ video cards.

ATI and Nvidia take the hint from AMD...conserve power.




RE: Perhaps a change is required
By josmala on 6/6/2006 11:30:31 AM , Rating: 2
And run it with a quarter of the memory bandwidth available on today's graphics cards. That's the trade-off for what you propose. The socketing and slotting doesn't come free: first, it increases initial costs; second, it makes electrical design harder. The second issue, combined with 128-bit as the maximum reasonable memory interface for such a card, would end up reducing the memory bandwidth a lot.


RE: Perhaps a change is required
By Trisped on 6/7/2006 1:27:17 PM , Rating: 2
It is not the manufacturers that don't realize that power costs $$$, it is the consumers. Everyone puts so much pressure on being the company with the fastest card that both NVIDIA and ATI have to dump excessive amounts of money into creating a card that is just a little more powerful than their competitor's.

GPU flip chips have been discussed in the past, but the increase in cost, the loss of efficiency, and the fact that each GPU revision requires a new board and interface make this idea ineffective.


ATI... focus on core competence please.
By lemonadesoda on 6/6/2006 12:02:02 PM , Rating: 2
ATI does not need to create a dual GPU card. The whole concept of their memory/bus "ring" is that they can just add more shader/vertex units to the ring. The ATI core is designed to scale.

Crossfire is just an elastic band between two physical slots and two "rings". It's an interim solution, if not a copy-cat reaction to the press nVidia got with their SLI concept.

In fact, from what we understand, ATi had the technology for a long time but saw no viable commercial application for it, since it was always more efficient to just produce the "next gen" GPU, i.e. 9500 over 8500, 9700 over 9500, x800 over 9800, x1800 over x800, etc. Each of these generational steps has been MORE EFFECTIVE than just doubling the previous GPU's power via Crossfire/SLI methods.

Far more efficient, elegant, and cheaper to produce, with the benefit of lower power consumption, is to add more processing units and make each unit more efficient. This would, however, require increased die size.

No need for Crossfire. No need for dual PCIe x16. No need to duplicate the GDDR3 memory on each card (where, in practice, the same data/textures are stored on both cards at the same time). No need for the complex and expensive mainboards. No need for large cases with multiple slots. No need for gigawatt PSUs. No need for air conditioning systems. No need for ear-plugs.

Just one nice big 64 vertex/shader processor unit GPU. Then 96. Then 128.

ATI should focus on developing this silicon, then focus on reducing volt/power requirements.

Simple.

Why are they wasting so much time, money and editorial resource doing this silly 3 GPU business? Is it smoke and mirrors? Just to grab some headlines. Free advertising. REALLY worried about AGEIA? Or what AGEIA might achieve in their v2.0?




By Trisped on 6/7/2006 1:51:56 PM , Rating: 2
They are doing it so they can stay one step ahead of NVIDIA, which is why both companies do anything innovative.

There is also the PCIe advantage being created. For a long time the only PCIe cards have been graphics cards. Part of this is because most boards only have slots available for graphics, with the extras covered over by the HSF of the GPU card. This way you are guaranteed that 1 extra slot will be open, thereby encouraging manufacturers to produce hardware for PCIe instead of just PCI. Besides, isn't the goal to have 6-7 PCIe 16x slots?


By Trisped on 6/7/2006 1:56:30 PM , Rating: 2
I think the idea is closer to "I own an x1800 that I bought right when it came out. I plan on owning the top of the line R600. Rather than sell my x1800, it would be nice to use it as a physics processor."


Power..
By trintron on 6/7/2006 2:55:28 PM , Rating: 2
Two cards are too much, three is overkill.




RE: Power..
By Trisped on 6/8/2006 12:52:58 PM , Rating: 2
Don't buy three, buy 1. In six months to a year, buy another. This may sound familiar, as it is what NVIDIA said about SLI when it first came out. Then it turned out to be false: for the same amount of money as buying a second card to complete your SLI, you could buy a new card that was twice as fast. Hilarious. All these people forked out $300+ for an SLI MBoard and an extra $50-100 for an SLI version of their video card, only to not get a cost-effective upgrade path in the future.

Enter ATI: rather than just trying to sell more, they try to sell smarter and encourage loyalty. You have an x1600 that works great right now? Good. In a year or two are you planning to upgrade to a new card? Even better. When you do, just pick up a new motherboard that has 2+ 16x slots and you can keep that old x1600 for physics and use the new card for graphics. Have an old AIW that you want to keep for the TV tuner but don't want to fork out the extra cash for the AIW version of the new card you want? Just stick the AIW in the second slot and use the new card as the primary graphics card. Even better, you can still use the AIW for physics and TV watching.


Short the stock
By lemonadesoda on 6/6/2006 6:55:12 PM , Rating: 1
Based on the general consensus of the postings here, I've decided to sell ATi stock.




RE: Short the stock
By Trisped on 6/7/2006 1:53:46 PM , Rating: 2
You are not Forbes, you can't manufacture stock trends that easily. :)


Full Towers
By xKelemvor on 6/6/2006 9:39:58 AM , Rating: 2
Going to be time everyone has to have a Full Tower case and huge Mobos because of all the expansion slots people are going to need soon. 3 slots just for the video cards? Sheesh. Hope everything else is onboard...




stupid engineering
By ninja123 on 6/6/2006 11:58:02 AM , Rating: 2
This display card industry is starting to go nuts; haven't they learned the lesson from the microprocessor industry? Perhaps what the AGEIA guys should do is design a low-power GPU, so NVIDIA and ATI can kiss their asses bye bye. (Something that almost happened to Intel, if it weren't for their Core microarchitecture.) Running a 1kW PSU for my PC is like running an air conditioning unit.




unrealistic
By hoppa on 6/6/2006 1:54:31 PM , Rating: 2
quote:
Take for example a gamer that is using a single Radeon X1600 Pro graphics card right now and decides that they want to kick it up few notches and go with dual Radeon X1900 class graphics cards in CrossFire mode for maximum performance.


So what you are saying is, some gamer that conservatively (or, realistically) spent $175 on a very decent midrange card, suddenly decides to go ape-shit wild and drop $600 on new cards, $100+ on a new mobo, $100 on a new PS, and not even sell his old gear, just so he can get better frames where he didn't even need them? Not realistic at all.

I'm glad nVidia and ATi enable us to get these uber-ridiculous setups, with previously 2 and now 3 graphics cards working together in tandem, but honestly, how many people actually go near any of this stuff? Especially with power consumption the way it is, the premium for 2 cards rather than 1 is ridiculous. If I had to choose between adding on another 6x00 or just buying a 6(x+3)00 and selling the old, I'd opt for the new card.




second core
By kitchme on 6/6/2006 1:57:13 PM , Rating: 2
With most games running on one CPU core, why not try to harness the power of that second core (for those with dual core) to run physics calculations? Would that be possible? And I would know for sure what to do with my 'aging' card. There's always a system waiting to be built.




By Tanclearas on 6/6/2006 4:14:11 PM , Rating: 2
I guess I just don't understand what is "new" about this announcement. What exactly was revealed that we didn't already know? We already knew ATI (actually, Havok) was planning this, so is it available right now? Are there games that support it? Do existing Havok-engine-based games benefit in any way right now? Are there updated drivers available? As far as I can tell, the answer to all of those questions is "no".

I fail to understand why this announcement is generating so much reaction when it doesn't look like there is any new information, or any ability to enable this capability. What's next? An "announcement" by Microsoft that there's going to be a new version of Windows?




Specs?
By cparker15 on 6/7/2006 12:05:40 PM , Rating: 2
This is fantastic news! Even though it's not new news, it's still great! Progress is being made every day!

Now, how about ATI releasing their specs so developers of free software can write free drivers to enable all of this to work on free operating systems, such as GNU (with Linux, Hurd, et al.) or *BSD?




stupid example
By bunnyfubbles on 6/6/2006 10:43:31 AM , Rating: 1
Someone who goes a midrange route (such as X1600Pro) isn't going to suddenly decide to go dual X1900. And if they do, they should probably have the money to buy 3 X1900s to have the fastest setup possible.

As much as I like ATI, this stinks of gimmick all over.

As it stands the X1600Pro is no beast of a video card, I fail to believe it would make a decent physics card.

And what's more, the emphasis is on dual card setups, keeping the physics process in the 3rd-leg spot.

I'd much rather pay for one fast video card/CPU and PPU than waste money on an SLI/Xfire setup. Sure, SLI/Xfire makes a whole lot more sense today than jumping on the physics bandwagon, however I don't see this holding true in the future, not by a long shot.

Although it isn’t all bad; even though ATI is still promoting video cards first, they’re at least admitting to a future with physics, and not some crap onboard solution.




"I mean, if you wanna break down someone's door, why don't you start with AT&T, for God sakes? They make your amazing phone unusable as a phone!" -- Jon Stewart on Apple and the iPhone











Copyright 2014 DailyTech LLC.